For my photography site I have three tables relevant to this question: one for gallery categories, one for photographs, and one to link photographs to categories. The link table allows a many-to-many relationship between photographs and categories.
The user will specify what photographs belong to a category on a web form, and PHP will process the POST.
I want to prevent a photograph from being added to a category more than once, and to handle attempts at duplicates in an elegant manner.
I have a hunch that doing a query before each insert is not the best way to deal with this, so I am asking for advice.
The table structures are:
"gallery_category" => "CREATE TABLE gallery_category (
category_id int unsigned NOT NULL AUTO_INCREMENT PRIMARY KEY,
category_order int unsigned,
category_name varchar(50),
category_description text
) ENGINE=MyISAM;",
"photographs" => "CREATE TABLE photographs (
photo_id int unsigned NOT NULL AUTO_INCREMENT PRIMARY KEY,
photo_filename varchar(50),
photo_alt_text varchar(50),
photo_caption text
) ENGINE=MyISAM;",
"cat_photos" => "CREATE TABLE cat_photos (
cat_photo_id int unsigned NOT NULL AUTO_INCREMENT PRIMARY KEY,
category_id int unsigned,
photo_id int unsigned,
active int unsigned,
`order` int unsigned
) TYPE=MyISAM;",
A duplicate is defined as rows in cat_photos with the same values for category_id and photo_id.
In PHP I use PDO and run my SQL inside a try block, so I can trap errors gracefully and easily process records in a loop.
If the user submits, say, 10 photos to be added to a category and one is a duplicate, I want the nine to be inserted and the one to fail in a manner that I can report back via a PHP-driven page.
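For what it's worth, the loop I have in mind is roughly shaped like the sketch below; the connection details and variable names are just placeholders, and the catch assumes the insert itself can somehow be made to fail on a duplicate:

    // Rough shape of my processing loop -- placeholder names and connection details.
    $pdo = new PDO('mysql:host=localhost;dbname=gallery', $db_user, $db_pass,
        array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));
    $stmt = $pdo->prepare(
        'INSERT INTO cat_photos (category_id, photo_id, active, `order`) VALUES (?, ?, 1, ?)');
    $failed = array();
    foreach ($photo_ids as $i => $photo_id) {
        try {
            $stmt->execute(array($category_id, $photo_id, $i));
        } catch (PDOException $e) {
            // Collect the failure so the page can report which photo was the duplicate.
            $failed[] = $photo_id;
        }
    }
    // $failed is what the PHP-driven page would report back to the user.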
As I said, I can handle this with a query before each insert, but I suspect there is a better way.
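To be concrete, by "a query before each insert" I mean something along these lines (parameter markers are placeholders):

    -- check whether the pairing already exists
    SELECT COUNT(*) FROM cat_photos WHERE category_id = ? AND photo_id = ?;
    -- then run the insert only when that count comes back zero
    INSERT INTO cat_photos (category_id, photo_id, active, `order`) VALUES (?, ?, 1, ?);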
Better table definition, perhaps? To get an error on the INSERT that I can trap?
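In case it helps anyone answer, the sort of thing I am imagining is a composite unique index on the pair of columns, though I don't know whether that is the recommended approach (the index name is just an example):

    -- my guess at a "better table definition": a unique index on the pair
    ALTER TABLE cat_photos ADD UNIQUE KEY uniq_category_photo (category_id, photo_id);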
Thanks to all who reply.