Hi,

I have an entity called Property that has a field of type text with a machine name of field_prop_code (its data is stored in the table field_data_field_prop_code).
I have written code to create Property entities from an uploaded list (I am migrating from an old system to this new one, passing the old property data in via JSON).
The code is like the following:
global $user;
$values = array(
  'type' => 'property',
  'title' => $json_prop->name . ' (' . $json_prop->code . ')',
  'uid' => $user->uid,
  'status' => 1,
  // Was 1 (i.e. one second after the epoch); use the current request time.
  'created' => REQUEST_TIME,
  'comment' => 1,
  'promote' => 0,
);
$entity = entity_create('node', $values);
$ewrapper = entity_metadata_wrapper('node', $entity);
$ewrapper->field_prop_code->set($json_prop->code);
$ewrapper->save();

It works fine for the first couple of properties, but then I get the following error: SQLSTATE[23000]: Integrity constraint violation: 1062 Duplicate entry 'node-10352-0-0-und' for key 'PRIMARY', query: INSERT INTO {field_data_field_prop_code}.

I have now figured out the cause: the table field_data_field_prop_code contains rows that refer to old entities that no longer exist, but whose entity_id is greater than the current maximum node nid. When a newly created node's nid matches one of these orphaned rows, the insert fails with the Duplicate entry exception.

When I run select max(nid) from node; I get 9959
and when I run select max(entity_id) from field_data_field_prop_code; I get 10626.
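To see exactly which rows are orphaned, rather than just comparing the two maxima, you can join the field table against node. A sketch in plain MySQL, assuming the default table names, no database table prefix, and that the field is only attached to nodes:

```sql
-- List field rows whose entity_id no longer matches any existing node.
SELECT f.entity_id
FROM field_data_field_prop_code f
LEFT JOIN node n ON n.nid = f.entity_id
WHERE f.entity_type = 'node'
  AND n.nid IS NULL;
```

Any entity_id this returns is an orphan, whether it sits above or below max(nid).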

I have been working on this project for many months, so over time the field data has obviously got out of whack.
So my question is how do I clear these out in a clean and safe way?

I have dug around and can't find a solution, so I am going to back up my db and try the following:

function clear_out_field_tables_for_mig() {
  $output = 'clear_out_field_tables_for_mig';
  clear_out_rest_tables("SHOW TABLES LIKE '%field_data%'");
  clear_out_rest_tables("SHOW TABLES LIKE '%field_revision%'");
  return $output;
}

function clear_out_rest_tables($sql) {
  // Highest nid currently in the node table (was hardcoded as 9959).
  $max_nid = (int) db_query('SELECT MAX(nid) FROM {node}')->fetchField();
  $results = db_query($sql);
  foreach ($results as $row) {
    // Each result row has a single column containing the table name.
    $table_name = reset((array) $row);
    // db_fetch_object() is Drupal 6 API; in Drupal 7 use fetchField().
    $max_entity_id = (int) db_query("SELECT MAX(entity_id) FROM $table_name")->fetchField();
    if ($max_entity_id > $max_nid) {
      db_query("DELETE FROM $table_name WHERE entity_id > :max", array(':max' => $max_nid));
      drupal_set_message('Deleted orphaned entries in ' . $table_name);
    }
  }
}
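A slightly safer variant of that delete removes only rows that genuinely have no matching node, instead of everything above the current max nid (which would also hit any legitimate rows if the two counters ever crossed). A sketch in plain MySQL for one table, assuming no table prefix and that only node fields are involved; the same statement would need to be repeated for each field_data_% and field_revision_% table, after backing up the database:

```sql
-- Delete only field rows whose referenced node no longer exists.
DELETE f
FROM field_data_field_prop_code f
LEFT JOIN node n ON n.nid = f.entity_id
WHERE f.entity_type = 'node'
  AND n.nid IS NULL;
```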

Comments

John_B:

You seem to have answered your own question. This kind of thing in Drupal is often a real headache, and interventions in the database turn out to have ramifications you did not expect. As an alternative, I might have been tempted to try exporting all nodes (maybe with the Node Export module), deleting all nodes, and re-importing them.

Digit Professionals specialising in Drupal, WordPress & CiviCRM support for publishers in non-profit and related sectors

gony:

Thanks J. It appears to have worked - fingers crossed!