Access Ignores Excel data type


Ray S.

I've read a lot of posts regarding trouble importing data from Excel into an
Access table, but nothing seems to work.

The offending data is text. I have formatted the cells in Excel as text.
I'm importing it into a pre-existing table in Access where I have set the
field data type to Text.
Most of the data looks like numbers, but it is not. Occasionally a bit of
data is obviously text, but Access ignores both the data type of the
Excel cells and the data type of the Access field and just gives me a type
conversion error and doesn't import the offending data.
The only solution I have found is to cut the entire range of cells from
Excel and paste them into Notepad, then re-copy them back into Excel. Isn't
there a less cumbersome way to avoid the data errors and get the data
transferred?
 

Bill Edwards

The following might help:
http://support.microsoft.com/kb/109376/en-us
http://support.microsoft.com/kb/208414/en-us

I have gotten around a similar problem by prefixing every cell in that
column with a single apostrophe, which forces Excel to treat the contents as text.
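For a large range you can add those apostrophes programmatically. A minimal
sketch follows; the workbook path and the column (A) are assumptions for the
example, not details from this thread:

' Prefix every non-empty cell in column A of the first sheet with an
' apostrophe so Excel stores the value as text before importing.
Public Sub PrefixColumnWithApostrophe()
    Dim xlsApp As Object          ' late binding avoids needing an Excel reference
    Dim xlsWorkBook As Object
    Dim xlsWorksheet As Object
    Dim lngRow As Long

    Set xlsApp = CreateObject("Excel.Application")
    Set xlsWorkBook = xlsApp.Workbooks.Open("C:\Data\Import.xls")   ' assumed path
    Set xlsWorksheet = xlsWorkBook.Worksheets(1)

    lngRow = 2   ' assume row 1 holds headings
    Do While Len(Trim(xlsWorksheet.Range("A" & lngRow).Value & "")) > 0
        ' Writing the value back with a leading apostrophe forces a text entry.
        xlsWorksheet.Range("A" & lngRow).Value = _
            "'" & xlsWorksheet.Range("A" & lngRow).Value
        lngRow = lngRow + 1
    Loop

    xlsWorkBook.Close SaveChanges:=True
    xlsApp.Quit
    Set xlsWorksheet = Nothing
    Set xlsWorkBook = Nothing
    Set xlsApp = Nothing
End Sub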

I have also gotten around the problem by:
Creating an Excel Application object in Access
Opening the relevant workbook/worksheet
Stepping through the rows and columns of the worksheet to populate an Access
table row by row. For my particular case, this was the most reliable method
because I had no control over how the spreadsheet was formatted and could
not make changes to it.

Creating Excel Application, Workbook and Worksheet objects and then
retrieving the value of an individual cell as follows:

strRevisedVIN = xlsWorksheet.Range("U" & lngRow).Value

will retrieve the contents of the Excel cell without the type conversion
error that you get when you import or link to the same Excel workbook (go
figure). I can post the code if you want, but it is quite lengthy.
 

Ray S.

It sure would be nice to study the code, lengthy or not. This is a recurring
problem for me that has become too cumbersome to handle the way I mentioned.
 

Ray S.

By the way Bill, the 208414 solution Microsoft suggests is ridiculous. Why in
the world should we have to put a space into each cell, then remove the space
to get around the fact that even if you format the Excel cells as text Access
ignores that? And, solution 109376 is equally lame and cumbersome. This seems
to me a glaring programming error.
 

Bill Edwards

I have to agree that the Microsoft "fixes" are kludges. What makes it
really annoying is that this goes back to at least Access 97. The
monstrosity that follows is code that I wrote because this "issue"
effectively prevented me from linking to an Excel spreadsheet and then
writing a simple update query (which probably would have taken less than 30 minutes
to write and debug).

Briefly, I am connecting to a SQL Server table called VEHFILE that contains
information to do with vehicles (and yes, the same problem exists with SQL
Server tables as well as Jet tables). The primary key for the VEHFILE table
is called VEHICLE. Each row in the Excel spreadsheet contains information
to do with a single vehicle. A row in Excel is then used to update the
corresponding VEHICLE row in SQL Server.

You could do the same thing in Access and simply use an append query, or
you could create an Access/Jet recordset object and append to it or update it.
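As a minimal sketch of that Jet/DAO recordset variant (the table name
tblVehicles, the field names and the workbook path are placeholders I've made
up, not names from the actual application), something like this would walk the
worksheet and append row by row:

' Walk the worksheet row by row and append each row to a local Jet table
' via a DAO recordset. Requires references to Excel and DAO.
Public Sub AppendWorksheetToJetTable()
    Dim xlsApp As Excel.Application
    Dim xlsWorkBook As Excel.Workbook
    Dim xlsWorksheet As Excel.Worksheet
    Dim rst As DAO.Recordset
    Dim lngRow As Long

    Set xlsApp = New Excel.Application
    Set xlsWorkBook = xlsApp.Workbooks.Open("C:\Data\Vehicles.xls")
    Set xlsWorksheet = xlsWorkBook.Worksheets(1)
    Set rst = CurrentDb.OpenRecordset("tblVehicles", dbOpenDynaset)

    lngRow = 2   ' assume headings in row 1
    Do While Len(Trim(xlsWorksheet.Range("A" & lngRow).Value & "")) > 0
        rst.AddNew
        ' Reading .Value through automation returns the cell contents as-is,
        ' so a text field receives text even when the cell looks numeric.
        rst!VehicleNumber = Trim(xlsWorksheet.Range("A" & lngRow).Value & "")
        rst!SerialNumber = Trim(xlsWorksheet.Range("U" & lngRow).Value & "")
        rst.Update
        lngRow = lngRow + 1
    Loop

    rst.Close
    xlsWorkBook.Close False
    xlsApp.Quit
    Set rst = Nothing
    Set xlsWorksheet = Nothing
    Set xlsWorkBook = Nothing
    Set xlsApp = Nothing
End Sub

What follows is the full SQL Server version I actually use.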



Option Compare Database
Option Explicit

Const strQuote As String = """"
Const intVehicleNumberLength As Integer = 8
Const intVehicleMakeLength As Integer = 15
Const intVehicleModelLength As Integer = 15
Const intSerialNumberLength As Integer = 15
Const intLicensePlateLength As Integer = 15
Const intMPICNumberLength As Integer = 20
Const intDivisionLength As Integer = 10

Public Function UpdateXL_SQL(ByVal strXLFileName As String, _
ByVal strServerName As String, ByVal strDatabaseName As String, _
ByVal strSecurity As String, _
ByVal strUserName As String, ByVal strPassword As String, _
ByVal strReportOnly As String, ByVal strLastDate As String) As Date
On Error GoTo Err_Label

Dim intLogFile As Integer
Dim cnn As ADODB.Connection
Dim rst As ADODB.Recordset
Dim xlsApp As Excel.Application
Dim xlsWorkBook As Excel.Workbook
Dim xlsWorksheet As Excel.Worksheet
Dim lngRow As Long
Dim strSource As String
Dim strErrorMessage As String
Dim strConnect As String
Dim varVehicle As Variant
' Dim variables that correspond to table fields:
Dim strType As String
Dim strDepartment As String
Dim strRevisedNumber As String
Dim strRevisedType As String
Dim strRevisedYear As String
Dim intRevisedYear As Integer
Dim strRevisedMake As String
Dim strRevisedModel As String
Dim strRevisedVIN As String
Dim strRevisedPlate As String
Dim strRevisedMPI As String
' Dim other stuff:
Dim strUpdate As String
Dim strOldClassCode As String
Dim strOldType As String
Dim lngRecordsAffected As Long
Dim lngRecordsUpdated As Long
Dim lngRowErrors As Long
Dim strNewCityUnitNumber As String
Dim strVehicleLocation As String

Dim datLastDate As Date ' this is the last date previously processed
Dim datMaxDate As Date ' this will store the last date processed in the spreadsheet
datMaxDate = #1/1/2000#
datLastDate = CDate(strLastDate)

If LCase(strSecurity) = "trusted" Then
' if this is a trusted connection according to the configuration file,
' attach using trusted security
strConnect = "Provider=sqloledb;Data Source=" _
& strServerName & ";Initial Catalog=" & strDatabaseName _
& ";Integrated Security=SSPI;"
Else
' if this is not a trusted connection according to the configuration file,
' attach using SQL Server security
strConnect = "Provider=sqloledb;Data Source=" _
& strServerName & ";Initial Catalog=" & strDatabaseName _
& ";UID=" & strUserName & ";PWD=" & strPassword & ";"
End If

' Connect to your SQL Server Database
Set cnn = New ADODB.Connection
cnn.Open strConnect
Set rst = New ADODB.Recordset

' Set up the Excel application object, workbook object and worksheet object
Set xlsApp = New Excel.Application
Set xlsWorkBook = xlsApp.Workbooks.Open(strXLFileName)
Set xlsWorksheet = xlsWorkBook.Worksheets(1)

' Open a log file that records the date, time this was run and the user name who ran it:
intLogFile = FreeFile
Open "Renumber_log_" & Format(Now(), "yyyymmdd_hhnn") & ".txt" For Output As #intLogFile
Print #intLogFile, _
"Run by User: " & UserName() & " at " & Format(Now(), "yyyymmdd_hhnn")

lngRecordsUpdated = 0
lngRowErrors = 0
lngRow = 2

' check the WFMA Unit Number column (A)

' Loop through the worksheet while Column A contains something:
Do While Len(Trim(xlsWorksheet.Range("A" & lngRow))) > 0

If Len(Trim(xlsWorksheet.Range("Y" & lngRow)) & "") > 0 Then

If xlsWorksheet.Range("Y" & lngRow) > datMaxDate Then
datMaxDate = xlsWorksheet.Range("Y" & lngRow)
End If
If xlsWorksheet.Range("Y" & lngRow) > datLastDate Then
' Using the WFMA unit number, SELECT that vehicle's information
' from the VehFile table

strUpdate = ""
strErrorMessage = ""
varVehicle = xlsWorksheet.Range("A" & lngRow).Value

strSource = "SELECT Vehicle, Veh_XRef_Num, Class_Code, Year,
Vehicle_Make, Vehicle_Location, " _
& "Vehicle_Model, Serial_Number, License_Plate, Division,
Registration_Num " _
& "FROM Vehfile WHERE Vehfile.Vehicle = " &
adhHandleQuotes(varVehicle)

rst.Open strSource, cnn, adOpenDynamic, adLockOptimistic

If rst.EOF And rst.BOF Then
' there are no matching records for that row; generate an error message.
' no further processing will be done.
strErrorMessage = strErrorMessage & "A:Invalid Vehicle Number"
Else
' fill memory variables with the relevant spreadsheet values
If IsNull(rst.Fields("Vehicle_Location").Value) Then
strVehicleLocation = ""
Else
strVehicleLocation = rst.Fields("Vehicle_Location").Value
End If


strOldClassCode = rst.Fields("Class_Code")
strOldType = Left(strOldClassCode, 3)

' Load memory variables with the contents of Excel Cells:
strType = xlsWorksheet.Range("F" & lngRow).Value
strNewCityUnitNumber = xlsWorksheet.Range("C" & lngRow).Value
strDepartment = xlsWorksheet.Range("M" & lngRow).Value
strRevisedNumber = xlsWorksheet.Range("O" & lngRow).Value
strRevisedType = xlsWorksheet.Range("Q" & lngRow).Value
strRevisedYear = xlsWorksheet.Range("R" & lngRow).Value
strRevisedMake = xlsWorksheet.Range("S" & lngRow).Value
strRevisedModel = xlsWorksheet.Range("T" & lngRow).Value
strRevisedVIN = xlsWorksheet.Range("U" & lngRow).Value
strRevisedPlate = xlsWorksheet.Range("V" & lngRow).Value
strRevisedMPI = xlsWorksheet.Range("W" & lngRow).Value
' Finished loading

' trim leading and trailing spaces in the memory variables
strVehicleLocation = Trim(strVehicleLocation)
strOldClassCode = Trim(strOldClassCode)
strNewCityUnitNumber = Trim(strNewCityUnitNumber)
strDepartment = Trim(strDepartment)
strRevisedNumber = Trim(strRevisedNumber)
strRevisedType = Trim(strRevisedType)
strRevisedYear = Trim(strRevisedYear)
strRevisedMake = Trim(strRevisedMake)
strRevisedModel = Trim(strRevisedModel)
strRevisedVIN = Trim(strRevisedVIN)
strRevisedPlate = Trim(strRevisedPlate)
strRevisedMPI = Trim(strRevisedMPI)

' Do error checking based on business rules:
' we are no longer checking this column
'If Len(strType & "") > 0 Then
' If Len(strType) > 3 Then
' strErrorMessage = strErrorMessage & "F:3Long "
' Else
' ' do not update just flag
' If strOldType <> strType Then
' strErrorMessage = strErrorMessage & "F:TypeIncorrect "
' End If
' End If
'End If

If Len(strVehicleLocation & "") > 0 Then
If Left(strVehicleLocation, 3) <> Left(strNewCityUnitNumber, 3) Then
strErrorMessage = strErrorMessage & "Invalid Vehicle_Location "
End If
Else
strErrorMessage = strErrorMessage & "Blank Vehicle_Location "
End If

If Len(strDepartment & "") > 0 Then
If Trim(rst.Fields("Division").Value) <> strDepartment Then
strErrorMessage = strErrorMessage & "M:InvalidDepartment "
End If
End If

If Len(strRevisedNumber & "") > 0 Then
If Len(strRevisedNumber & "") > intVehicleNumberLength Then
strErrorMessage = strErrorMessage & "O:8Long "
Else
' update
If Trim(rst.Fields("Veh_XRef_Num").Value) <>
strRevisedNumber Then
strUpdate = strUpdate & "Veh_XRef_Num = " &
adhHandleQuotes(strRevisedNumber) & ", "
End If
End If
End If

If Len(strRevisedType & "") > 0 Then
If Len(strRevisedType & "") > 3 Then
strErrorMessage = strErrorMessage & "Q:3Long "
Else
' do not update, just flag if in error
If Left(strNewCityUnitNumber, 3) <> strRevisedType Then
' If strOldType <> strRevisedType Then
strErrorMessage = strErrorMessage & "Q:TypeIncorrect
"
End If
End If
End If

If Len(strRevisedYear & "") > 0 Then
If IsNumeric(strRevisedYear) Then
' update
' SET VehFile.[Year] = val(strRevisedYear)
If rst.Fields("Year").Value <> Val(strRevisedYear) Then
strUpdate = strUpdate & "[Year] = " &
Val(strRevisedYear) & ", "
End If
Else
strErrorMessage = strErrorMessage & "R:NotNumeric "
End If
End If

If Len(strRevisedMake & "") > 0 Then
If Len(strRevisedMake & "") > intVehicleMakeLength Then
strErrorMessage = strErrorMessage & "S:15Long "
Else
'update
' SET VehFile.Vehicle_Make = adhHandleQuotes(strRevisedMake)
If Trim(rst.Fields("Vehicle_Make").Value) <> strRevisedMake Then
strUpdate = strUpdate & "Vehicle_Make = " & adhHandleQuotes(strRevisedMake) & ", "
End If
End If
End If

If Len(strRevisedModel & "") > 0 Then
If Len(strRevisedModel & "") > intVehicleModelLength Then
strErrorMessage = strErrorMessage & "T:15Long "
Else
'update
If Trim(rst.Fields("Vehicle_Model").Value) <>
strRevisedModel Then
strUpdate = strUpdate & "Vehicle_Model = " &
adhHandleQuotes(strRevisedModel) & ", "
End If
End If
End If

If Len(strRevisedVIN & "") > 0 Then
If Len(strRevisedVIN & "") > intSerialNumberLength Then
strErrorMessage = strErrorMessage & "U:15Long"
Else
If strRevisedVIN = "CAN'T VERIFY" Then
' do nothing
Else
If Trim(rst.Fields("Serial_Number").Value) <>
strRevisedVIN Then
strUpdate = strUpdate & "Serial_Number = " &
adhHandleQuotes(strRevisedVIN) & ", "
End If
End If
End If
End If

If Len(strRevisedPlate & "") > 0 Then
If Len(strRevisedPlate & "") > intLicensePlateLength Then
strErrorMessage = strErrorMessage & "V:15Long"
Else
If Trim(rst.Fields("License_Plate").Value) <>
strRevisedPlate Then
strUpdate = strUpdate & "License_Plate = " &
adhHandleQuotes(strRevisedPlate) & ", "
End If
End If
End If

If Len(strRevisedMPI & "") > 0 Then
If Len(strRevisedMPI & "") > intMPICNumberLength Then
strErrorMessage = strErrorMessage & "W:20Long"
Else
If strRevisedMPI = "NOT PRESENT" Then
' do nothing
Else
'update
If Trim(rst.Fields("Registration_Num").Value) <>
strRevisedMPI Then
strUpdate = strUpdate & "Registration_Num = " &
adhHandleQuotes(strRevisedMPI) & ", "
End If
End If
End If
End If
End If
' We have finished error checking

' Now actually update the SQL database with the current Excel row's
' data (if possible) and write to the log file
If Len(strUpdate & "") > 0 Then
strUpdate = Left(strUpdate, Len(strUpdate) - 2)
strUpdate = "UPDATE VehFile SET " & strUpdate _
& " WHERE Vehfile.Vehicle = " & adhHandleQuotes(varVehicle) & ";"
If strReportOnly <> "REPORTONLY" Then
cnn.Execute strUpdate, lngRecordsAffected, adCmdText + adExecuteNoRecords
End If
Print #intLogFile, Format(lngRow, "##########") & " - " & strUpdate
lngRecordsUpdated = lngRecordsUpdated + lngRecordsAffected

End If

If Len(strErrorMessage & "") > 0 Then
Print #intLogFile, Format(lngRow, "##########") & " - " & strErrorMessage
lngRowErrors = lngRowErrors + 1
End If
' Debug.Print lngRow & " - " & varVehicle
rst.Close
End If
End If

lngRow = lngRow + 1
Loop

' again write information to a log file
Print #intLogFile, " Worksheet rows processed: " & Format(lngRow,
"##########")
Print #intLogFile, " Records updated: " &
Format(lngRecordsUpdated, "##########")
Print #intLogFile, "Worksheet rows with errors: " & Format(lngRowErrors,
"##########")

Exit_Label:
On Error Resume Next
If Err.Number > 0 Then
Print #intLogFile, "ERROR ENCOUNTERED IN PROCESSING. ROWS MAY NOT
HAVE BEEN PROCESSED"
End If
Close #intLogFile
rst.Close
Set rst = Nothing
cnn.Close
Set cnn = Nothing
' do not bother saving because the workbook is read only anyway
xlsWorkBook.Close False
xlsApp.Quit
UpdateXL_SQL = Format(datMaxDate, "yyyy/mm/dd")
Exit Function
Err_Label:
MsgBox Err.Description & vbCrLf & Err.Number & vbCrLf & lngRow
Resume Exit_Label
End Function

Private Function Quotes(varValue As Variant) As String
On Error GoTo Err_Label
If Len(Trim(varValue)) > 0 Then
varValue = adhHandleQuotes(Trim(varValue))
Else
varValue = "NULL"
End If
Quotes = varValue

Exit_Label:
Exit Function
Err_Label:
MsgBox Err.Number & vbCrLf & Err.Description
Resume Exit_Label
End Function
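Note that the code above calls adhHandleQuotes, a string-quoting helper (from
the Access Developer's Handbook) that isn't shown here. If you don't have it,
a minimal stand-in along these lines should behave similarly for this purpose
(this is my own sketch of what it does, not the original routine): it wraps
the value in single quotes and doubles any embedded single quotes so the value
can be spliced safely into a SQL string.

' Minimal stand-in for the adhHandleQuotes helper used above (assumption:
' it returns the value wrapped in quotes, suitable for use as a SQL literal).
Public Function adhHandleQuotes(ByVal varValue As Variant) As String
    ' Double up any embedded single quotes, then wrap the whole value
    ' in single quotes: O'Brien -> 'O''Brien'
    adhHandleQuotes = "'" & Replace(Nz(varValue, ""), "'", "''") & "'"
End Function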
 

Rob Parker

Hi Ray,

A possible alternative to the code Bill Edwards posted in a separate thread
would be to use the TransferSpreadsheet method to import everything in the
spreadsheet into a (different) pre-existing table, with every field (named
F1 to Fn, where n is the number of columns you want to import) set as Text
datatype. Set the HasFieldNames parameter to False, and limit the Range
imported to a fixed number of columns, to prevent errors if there is data in
more columns than you have in your (new) import-dump table - use an entry
such as "!a:g" to import the seven left columns. Set up a query on this
table which converts valid data (eg. use an expression such as
ValidF1:iif(IsNumeric([F1]),[F1],Null) in the fields of interest), and run
it as an append command into your existing table (which has the desired
datatypes).

HTH,

Rob
 

Ray S.

Thanks Rob,

I'm studying Bill's code...could you explain your alternative with a little
more detail? I did try to use the TransferSpreadsheet method...importing into
a pre-existing table with the field data-types defined...I played with
designating the field names parameter as false, but that created another
problem because one of the fields is a date field with a long date format
(with several items separated by spaces). This required a dump file with more
fields than the original...would I then query that dump file to concatenate
the date field back to its original? I'm not sure how you're suggesting to
use IsNumeric. Does that convert a numeric data type to a string? I certainly
don't want the text entries to be converted to null values. I'll try to play
with it.

 

Rob Parker

Hi Ray,

My suggestion (which I used in an application I built last week) is as
follows:

Set up a table, with every field as a text field, into which the Excel data
is imported via the TransferSpreadsheet command. When you use this command,
with HasFieldNames set to false, you get the contents of the spreadsheet
imported with every field as a text field. You then set up a query on this
table which tests each field to determine if the text representation of the
entry can be interpreted as an entry of your desired datatype.

For example, consider a spreadsheet such as this, where column A has no
specific formatting applied, Column B has a date/time format applied, and
column C has a number format, with 2 decimal places (and numbers may be
entered to greater precision):

Text      Date              Number
sample1   13/05/2007 9:25   123.45
sample2   4/07/2011 17:25   78.00
sample3   23/02/2004 14:42  not a number
sample4   not a date        456.79

[Note: the alignment of these columns in this posting will almost certainly
be dreadful!!!]

I use the following statement to import this into tblImportRaw, which is a
table containing three text fields (F1, F2, F3):
DoCmd.TransferSpreadsheet acImport, , "tblImportRaw", "c:\book1.xls", False, "!a:c"
(As I said in my original post, the Range parameter will prevent an error if
the spreadsheet has data in any other column.)

tblImportRaw now contains the following data, all as text strings:

Text      Date           Number
sample1   39215.392361   123.45
sample2   40728.725694   78
sample3   38040.6125     not a number
sample4   not a date     456.79

The following query converts valid dates (which are now shown as numbers, in
the Microsoft standard format (days since 30-Dec-1899 as the integer
portion, time as the decimal portion) because the spreadsheet has that
column formatted as a date/time) in F2 into date/time entries, and valid
numbers in F3 into double entries:

SELECT F1 AS F1Text,
IIf(IsNumeric([F2]),CDate(CDbl([F2])),Null) AS F2Date,
IIf(IsNumeric([F3]),CDbl([F3]),Null) AS F3Number
FROM tblImportRaw;

The IsNumeric function is a Boolean function; I use it to test whether the
string can be interpreted as a number. If so, I convert it to a number,
using the CDbl function; otherwise, I return a Null. For the date/time
field, I first test for numeric, then convert from the double value to the
Date/Time datatype using CDate. Attempting to do this without the
intermediate conversion gives a type conversion error, which shows in the
query as #Error and cannot be trapped by the IsError function (I think
because it's a run-time error).

The output of this query is:

F1Text    F2Date                  F3Number
sample1   13/05/2007 9:25:00 AM   123.45
sample2   4/07/2011 5:25:00 PM    78
sample3   23/02/2004 2:42:00 PM
sample4                           456.78901234

This query can be converted into an append query to append the converted
data into a table with three fields, with types of Text, Date/Time, and
Double, with no data conversion errors.
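For instance, the append version of that query might look like the following;
tblFinal and its field names are placeholders I've made up for the example,
not names from an actual database:

INSERT INTO tblFinal ( TextField, DateField, NumberField )
SELECT F1,
IIf(IsNumeric([F2]),CDate(CDbl([F2])),Null),
IIf(IsNumeric([F3]),CDbl([F3]),Null)
FROM tblImportRaw;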

HTH,

Rob


 

Ray S.

OK, I am able to import the data without errors, but the raw import brings in
the Excel file's column headings as the first row of data. Is there a simple
way to avoid that?

 

Rob Parker

Hi Ray,

Sure. Just set the HasFieldNames parameter in the TransferSpreadsheet
statement to True, as in this example:
DoCmd.TransferSpreadsheet acImport, , "tblImportRaw", "c:\book1.xls", True, "!a:c"

I was importing with the column headings, and checking that the columns were
correct (by a DCount on the raw imported data to make sure that each field
contained the expected heading text within the data for that field) before
doing any further processing, showing a user-friendly message if they were
wrong. (And I found an instance in the sample data provided by a client
where the columns in one sheet were incorrect - so it was worthwhile doing!)
I would later set criteria on the processed data to only use records for
which the data is not null. If you don't want/need to do that, just change the
HasFieldNames parameter.
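A minimal sketch of that heading check (my own illustration; the table name,
field and heading text are assumptions, not taken from an actual application):

' Confirm that the expected heading text was imported into field F2 of the
' raw table before doing any further processing.
Public Function HeadingsLookCorrect() As Boolean
    ' DCount returns the number of rows in tblImportRaw whose F2 value
    ' matches the heading we expect to find in that column.
    HeadingsLookCorrect = (DCount("*", "tblImportRaw", "[F2] = 'Date'") > 0)
End Function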

I think you should be there now,

HTH,

Rob


 

Ray S.

Sorry Rob, but when I change the HasFieldNames parameter to True I get the
same import errors. I only avoid them if I set HasFieldNames to
False, but then I get the problem I'm now trying to solve.

 

Ron2006

When importing from Excel:

1) Always import into a work table, not the final table where the data
will reside.
One reason for this is that, depending on the import sheet, there may be
blank lines or lines you do not want.

2) Run a delete query on this work table to
a) drop the blank lines (and any other lines you do NOT want);
b) drop the record that corresponds to the headings
(see the query sketch after this list).

3) Run an append query using the work table as input. As part of
step one, run a delete query to empty the work table.
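As a rough sketch of step 2, assuming the work table is called tblImportRaw
with text fields F1 to F3 and a heading of "Date" in column F2 (all of these
names are placeholders of my own), two delete queries along these lines would
drop the blank rows and then the headings row:

DELETE FROM tblImportRaw
WHERE Trim([F1] & [F2] & [F3] & '') = '';

DELETE FROM tblImportRaw
WHERE [F2] = 'Date';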

The advantage to this is that if you have problems later on you can at
least see what the program imported from within Access and do not have
to assume what you think it imported by looking at the spreadsheet.
Another advantage is that you can add edit-criteria queries to
determine if you really want to use this data or reject the whole
import process.

Ron
 

Rob Parker

Hi again Ray,

What I've suggested is essentially the same as Ron2006 has posted in
another reply to this.

Import with HasFieldNames set to False, into the raw import table. Then
pre-process your data with a query, to convert to the appropriate datatypes.
Then append, using criteria to reject rows which you don't want - such as
the headings row from your spreadsheet, which will not have converted
numbers or date/time fields and will (if you used the method I previously
posted) have Null in those fields. So, in your append query, set criteria
of Is Not Null for those fields to get your real data. You might not want
to set this criterion on every field, if your spreadsheet does contain blank
cells in the actual data; you should set it for all fields which do (or
should) have data.
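For example, if the conversion query from the earlier post is saved as
qryConvert (a placeholder name), the append query with those criteria might
look like this; tblFinal and its field names are also placeholders:

INSERT INTO tblFinal ( TextField, DateField, NumberField )
SELECT F1Text, F2Date, F3Number
FROM qryConvert
WHERE F2Date Is Not Null AND F3Number Is Not Null;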

HTH,

Rob

 
