Hi,

I am loading data from a CSV file into a content type and everything is working fine except when I try to import a long text field / comment field. The data in the comment field should be treated as straight text.

Here is my data I am trying to upload via feeds:

LP 8/28/13 strategy:  extend to 2020. LP 4/8 LL First Allied rejected concept. LP 4/17 meeting: 5 yrs. LP 9/24/14 strategy: 10 yrs.3/2-Spoke w/Jordan Raines-First Allied last week. Submitted LOI on 3/2.

When I try to upload this, I receive the error message:

SQLSTATE[22007]: Invalid datetime format: 1366 Incorrect string value: '\xFFexten...' for column 'field_store_comments_value' at row 1.

SQLSTATE[22007]: Invalid datetime format: 1366 Incorrect string value: '\xFFexten...' for column 'message' at row 1

So my question is: how do I get a Feeds import of a text string into a long text field to ignore the date strings in the comment? I do not want the dates treated as dates; I want them treated as a plain string of characters.

Thanks,

Dennis

Comments

densolis’s picture

I figured it out. It is not a Feeds issue. It is a source file issue.

The source file is an Excel spreadsheet, which creates a CSV file using ASCII encoding. Unfortunately (or fortunately, depending upon how you look at it), Feeds was able to import every single field except the 3 long text fields.

Well, if the long text fields held only 30 to 50 characters, the import was fine. But my data was longer than that.

I was able to resolve my issue by using Notepad++ to read in the CSV file and change the encoding from ASCII to UTF-8.

You can change a file's encoding by editing the file in Notepad++ and choosing the Encoding option on the main menu. This is a drop-down menu of radio buttons. I changed my file's encoding from ASCII to UTF-8 and then saved the file.

I ran the Feeds import and this time everything worked fine. The long text fields were imported correctly.
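If you have many files to convert, the same re-encoding step can be scripted instead of done by hand in Notepad++. A minimal sketch, assuming the Excel export is in the Windows-1252 ("ANSI") code page, which is what the '\xFF' byte in the error message suggests; the function name and file paths are placeholders:

```python
def reencode_csv(src_path: str, dst_path: str) -> None:
    """Re-encode a CSV file to UTF-8 so Drupal/MySQL accepts it."""
    # Read with the assumed source encoding; cp1252 is an assumption --
    # adjust it if the export used a different code page.
    with open(src_path, "r", encoding="cp1252") as src:
        text = src.read()
    # Write plain UTF-8 *without* a BOM ("utf-8", not "utf-8-sig").
    with open(dst_path, "w", encoding="utf-8") as dst:
        dst.write(text)
```

This reads the whole file into memory, which is fine for typical CSV exports; very large files would need a chunked loop instead.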

Dennis

densolis’s picture

Status: Active » Closed (works as designed)
densolis’s picture

I found that if I change the file to UTF-8, then Excel will not work with the CSV. It treats the UTF-8 CSV file as a fixed-length record. But if you use Notepad++ to change the encoding to "Encode in UTF-8 without BOM", then both Drupal and Excel are happy.

After doing a bit more research, I found this:

The UTF-8 BOM is a sequence of bytes (EF BB BF) that allows the reader to identify a file as being encoded in UTF-8.

Normally, the BOM is used to signal the endianness of an encoding, but since endianness is irrelevant to UTF-8, the BOM is unnecessary.

According to the Unicode standard, the BOM for UTF-8 files is not recommended.
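Since the BOM is just those three fixed bytes at the start of the file, it is easy to detect and strip programmatically; a small illustration (the function name is hypothetical):

```python
# The UTF-8 byte order mark: EF BB BF.
UTF8_BOM = b"\xef\xbb\xbf"

def strip_utf8_bom(data: bytes) -> bytes:
    """Return the data with a leading UTF-8 BOM removed, if present."""
    if data.startswith(UTF8_BOM):
        return data[len(UTF8_BOM):]
    return data
```

Running a CSV through this before handing it to an importer avoids the BOM being read as part of the first field.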