This is weird... I have an instance where the "Stage MARC for import" tool is not working.

*** Interface testing

I can...

- Go to /cgi-bin/koha/tools/stage-marc-import.pl, choose the file to upload and click on "Upload file"
- The "Look for existing records in catalog?" and "Check for embedded item record data?" fields are then shown as expected: http://div.libriotech.no/files/2015/stage-fail/stage1.png
- If I then click on "Stage for import", the next page reports that it has successfully found the records: http://div.libriotech.no/files/2015/stage-fail/stage2.png
- The "Manage staged records" link points to e.g. /cgi-bin/koha/tools/manage-marc-import.pl?import_batch_id=34, where 34 is the next available id in the "import_batches" table
- BUT if I click on that link, it shows "No matching records found": http://div.libriotech.no/files/2015/stage-fail/stage3.png
- In fact that page looks the same if I replace 34 in the URL with e.g. 10000 - a batch that does not exist
- If I check the database, nothing has been saved in the import_batches table
- But if I repeat these steps the counter keeps going up, so I get batch 35, 36 etc.

*** Different instances

I have tested this on 3 different instances on 2 different servers, all running off the 3.18.3 packages:

InstanceA on Server1 - fail
InstanceB on Server1 - success
InstanceC on Server2 - success

If I take an SQL dump from InstanceA and load it into InstanceC (on Server2), I get:

InstanceC on Server2 - fail

... which would suggest that the problem lies in the data, not in the code. InstanceA has been through several upgrades; InstanceB was created specifically to test the problem described here.

*** Logs

No relevant errors are reported in /var/log/koha/InstanceA/intranet-error.log

I have turned on the binary log in MySQL, but the query to add the batch to the import_batches table is not recorded there (but somehow the counter is still incremented...)
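To double-check that the counter really does move without a row being written, something like the following could be run against InstanceA before and after a failed staging attempt. This is only a rough sketch; it assumes it is run in the instance's environment (e.g. via koha-shell) so that C4::Context->dbh connects to the right database:

    use Modern::Perl;
    use C4::Context;

    # Rough sketch: print the current AUTO_INCREMENT value for import_batches,
    # so it can be compared before and after a failed staging attempt.
    my $dbh = C4::Context->dbh;
    my ($next_id) = $dbh->selectrow_array(q{
        SELECT AUTO_INCREMENT
        FROM information_schema.tables
        WHERE table_schema = DATABASE() AND table_name = 'import_batches'
    });
    say "Next import_batch_id will be: $next_id";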
*** Code

As far as I can tell...

stage-marc-import.pl calls out to C4::ImportBatch::BatchStageMarcRecords():

    my ( $batch_id, $num_valid, $num_items, @import_errors ) = BatchStageMarcRecords(
        $record_type, $encoding, $marcrecord, $filename, $marc_modification_template,
        $comments, '', $parse_items, 0, 50, staging_progress_callback( $job, $dbh )
    );

BatchStageMarcRecords() then calls out to C4::ImportBatch::AddImportBatch():

    my $batch_id = AddImportBatch( {
        overlay_action => 'create_new',
        import_status  => 'staging',
        batch_type     => 'batch',
        file_name      => $file_name,
        comments       => $comments,
        record_type    => $record_type,
    } );

AddImportBatch() does this:

    my (@fields, @vals);
    foreach (qw( matcher_id template_id branchcode
                 overlay_action nomatch_action item_action
                 import_status batch_type file_name comments record_type )) {
        if (exists $params->{$_}) {
            push @fields, $_;
            push @vals, $params->{$_};
        }
    }
    my $dbh = C4::Context->dbh;
    $dbh->do("INSERT INTO import_batches (".join( ',', @fields).")
              VALUES (".join( ',', map '?', @fields).")",
             undef,
             @vals);
    return $dbh->{'mysql_insertid'};

I have tried several things with this code:

- The return value from $dbh->do() is 1, which should indicate success
- I have tacked on an "or die $dbh->errstr" after the do(), but it did not report any errors:

    $dbh->do("INSERT INTO import_batches (".join( ',', @fields).") VALUES (".join( ',', map '?', @fields).")", undef, @vals) or die $dbh->errstr;

- I have tried dumping the contents of @fields and @vals, and they look good
- I have pulled the construction of the SQL out of the do() and logged it with a warn, but the generated SQL looks good:

    my $sql = "INSERT INTO import_batches (".join( ',', @fields).") VALUES (".join( ',', map '?', @fields).")";
    warn $sql;
    $dbh->do($sql, undef, @vals);

*** Inserting manually

I have tried logging into MySQL and executing the SQL I got from the code by hand, and it works as it should:

    mysql> INSERT INTO import_batches (overlay_action,import_status,batch_type,file_name,comments,record_type) VALUES ('create_new','staging','batch','test-2-records.mrc','','biblio');
    Query OK, 1 row affected (0.00 sec)

*** mysqlcheck

I have run mysqlcheck on the database (via koha-mysqlcheck), but it does not report any problems for any of the tables in the database.
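Another experiment that might narrow things down: run the same insert through DBI from a small standalone script, rather than through the mysql command line client, and check whether the row is still there afterwards. This is only a sketch - the DSN, user and password are placeholders for InstanceA's real credentials:

    #!/usr/bin/perl
    use Modern::Perl;
    use DBI;

    # Placeholders - substitute InstanceA's real database name and credentials.
    my $dbh = DBI->connect( 'DBI:mysql:database=koha_instancea;host=localhost',
        'koha_instancea', 'secret', { RaiseError => 1 } );

    say "AutoCommit is " . ( $dbh->{AutoCommit} ? 'on' : 'off' );

    # The same insert that AddImportBatch() generates, but sent via DBI
    # instead of the mysql command line client.
    $dbh->do(
        "INSERT INTO import_batches
            (overlay_action, import_status, batch_type, file_name, comments, record_type)
         VALUES (?, ?, ?, ?, ?, ?)",
        undef,
        'create_new', 'staging', 'batch', 'test-2-records.mrc', '', 'biblio'
    );
    my $batch_id = $dbh->{'mysql_insertid'};

    # Verify that the row actually ended up in the table.
    my ($found) = $dbh->selectrow_array(
        "SELECT COUNT(*) FROM import_batches WHERE import_batch_id = ?",
        undef, $batch_id
    );
    say "Batch $batch_id: " . ( $found ? 'row exists' : 'row is missing' );

If the row sticks around here, the table itself is presumably fine, and the difference would lie in how the connection used by the staging code behaves.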
*** Table comparison

Output from "SHOW CREATE TABLE import_batches;" on InstanceA (failing) and InstanceB (not failing):

InstanceA:

    CREATE TABLE `import_batches` (
      `import_batch_id` int(11) NOT NULL AUTO_INCREMENT,
      `matcher_id` int(11) DEFAULT NULL,
      `template_id` int(11) DEFAULT NULL,
      `branchcode` varchar(10) DEFAULT NULL,
      `num_records` int(11) NOT NULL DEFAULT '0',
      `num_items` int(11) NOT NULL DEFAULT '0',
      `upload_timestamp` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
      `overlay_action` enum('replace','create_new','use_template','ignore') NOT NULL DEFAULT 'create_new',
      `nomatch_action` enum('create_new','ignore') NOT NULL DEFAULT 'create_new',
      `item_action` enum('always_add','add_only_for_matches','add_only_for_new','ignore','replace') NOT NULL DEFAULT 'always_add',
      `import_status` enum('staging','staged','importing','imported','reverting','reverted','cleaned') NOT NULL DEFAULT 'staging',
      `batch_type` enum('batch','z3950','webservice') NOT NULL DEFAULT 'batch',
      `file_name` varchar(100) DEFAULT NULL,
      `comments` mediumtext,
      `record_type` enum('biblio','auth','holdings') NOT NULL DEFAULT 'biblio',
      PRIMARY KEY (`import_batch_id`),
      KEY `branchcode` (`branchcode`)
    ) ENGINE=InnoDB AUTO_INCREMENT=40 DEFAULT CHARSET=utf8

InstanceB:

    CREATE TABLE `import_batches` (
      `import_batch_id` int(11) NOT NULL AUTO_INCREMENT,
      `matcher_id` int(11) DEFAULT NULL,
      `template_id` int(11) DEFAULT NULL,
      `branchcode` varchar(10) DEFAULT NULL,
      `num_records` int(11) NOT NULL DEFAULT '0',
      `num_items` int(11) NOT NULL DEFAULT '0',
      `upload_timestamp` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
      `overlay_action` enum('replace','create_new','use_template','ignore') NOT NULL DEFAULT 'create_new',
      `nomatch_action` enum('create_new','ignore') NOT NULL DEFAULT 'create_new',
      `item_action` enum('always_add','add_only_for_matches','add_only_for_new','ignore','replace') NOT NULL DEFAULT 'always_add',
      `import_status` enum('staging','staged','importing','imported','reverting','reverted','cleaned') NOT NULL DEFAULT 'staging',
      `batch_type` enum('batch','z3950','webservice') NOT NULL DEFAULT 'batch',
      `record_type` enum('biblio','auth','holdings') NOT NULL DEFAULT 'biblio',
      `file_name` varchar(100) DEFAULT NULL,
      `comments` mediumtext,
      PRIMARY KEY (`import_batch_id`),
      KEY `branchcode` (`branchcode`)
    ) ENGINE=InnoDB AUTO_INCREMENT=2 DEFAULT CHARSET=utf8

As far as I can tell, the only difference is in the order of the columns (record_type comes before file_name and comments on InstanceB). But given the form of the query, which names its columns explicitly, this should not be a problem, I think:

    INSERT INTO import_batches (overlay_action,import_status,batch_type,file_name,comments,record_type) VALUES ('create_new','staging','batch','test-2-records.mrc','','biblio');
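To rule out a subtler difference hiding in the definitions (defaults, charset, nullability), the column definitions could also be dumped from information_schema on both instances and diffed. A quick sketch, again with placeholder credentials:

    #!/usr/bin/perl
    use Modern::Perl;
    use DBI;

    # Placeholders - run once against InstanceA and once against InstanceB,
    # then diff the two outputs. Sorting by column name keeps the known
    # column-order difference out of the diff.
    my $dbh = DBI->connect( 'DBI:mysql:database=koha_instancea;host=localhost',
        'koha_instancea', 'secret', { RaiseError => 1 } );

    my $sth = $dbh->prepare(q{
        SELECT column_name, column_type, is_nullable, column_default,
               character_set_name, collation_name, extra
        FROM information_schema.columns
        WHERE table_schema = DATABASE() AND table_name = 'import_batches'
        ORDER BY column_name
    });
    $sth->execute;
    while ( my @row = $sth->fetchrow_array ) {
        say join "\t", map { defined $_ ? $_ : 'NULL' } @row;
    }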
*** Recreating tables

Tried dropping all import_* tables and then recreating them from kohastructure.sql. Still no luck.
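Something that might still be worth checking is whether the insert happens but never gets committed: $dbh->do() returning 1, the AUTO_INCREMENT counter climbing, and the binary log staying silent would all be consistent with an insert that is later rolled back, since the binary log only records committed transactions and InnoDB does not hand back auto-increment values after a rollback. A throwaway check along these lines (not part of Koha, just a temporary debugging sketch) could be dropped in at the end of C4::ImportBatch::BatchStageMarcRecords():

    # Temporary debugging only - a sketch, not part of Koha. Added right before
    # BatchStageMarcRecords() returns, it shows whether the connection is
    # running with AutoCommit off, and forces a commit if so.
    my $staging_dbh = C4::Context->dbh;   # same cached handle Koha uses
    warn "AutoCommit at end of staging: " . ( $staging_dbh->{AutoCommit} ? 'on' : 'off' );
    $staging_dbh->commit unless $staging_dbh->{AutoCommit};   # no-op when AutoCommit is on

If the batch survives with the forced commit in place, that would point at something in InstanceA's data or configuration turning AutoCommit off (or triggering a rollback) later in the request, rather than at the insert itself.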