
I had a heavy SQL dump of a table. I used the BigDump library to import it into a MySQL database on my server. It worked, but now I have duplicate entries in that table: the same table on my local server has 8 × 10⁵ records, but on the server it has 15 × 10⁵ records.

Can you suggest a query to delete the duplicate entries from this table? Here is my table structure.

(screenshot of the table structure)

The table name is: techdata_products

P.S. This table does not have any primary key.

  • I would recommend copying the de-duplicated rows into another table: CREATE TABLE temp_techdata_products SELECT DISTINCT * FROM techdata_products; Commented Oct 28, 2011 at 19:13
  • Are the 'dupe' records exact copies of each other (all fields equal), or are only some of the fields equal? Commented Oct 28, 2011 at 19:18

2 Answers


SQL is not my strong point, but I think you can export the result of this query:

SELECT DISTINCT * FROM table; 

Then create a new table and import your results.
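A minimal MySQL sketch of that approach, assuming the table is named techdata_products (as in the question) and that the duplicate rows are exact copies of each other:

```sql
-- Build a de-duplicated copy of the table. CREATE TABLE ... SELECT
-- copies the data but not the indexes, so add any keys afterwards.
CREATE TABLE techdata_products_clean
SELECT DISTINCT * FROM techdata_products;

-- After verifying the row count, swap the tables in one step.
RENAME TABLE techdata_products TO techdata_products_old,
             techdata_products_clean TO techdata_products;
```

RENAME TABLE swaps both names in a single statement, so the original data stays available as techdata_products_old until you are satisfied and drop it.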



For starters, why does the table have no primary key? You could simply have made that auto-incrementing id field a primary key to prevent duplicates. My suggestion would be to create a new table that has a primary key, do a

SELECT DISTINCT * FROM table

and put the results into the new table.
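A sketch of this suggestion; the real columns are only visible in the question's screenshot, so product_code and description below are placeholder names, not the actual schema:

```sql
-- New table with a surrogate primary key and a unique key over the
-- data columns, so future imports cannot create duplicate rows.
CREATE TABLE techdata_products_new (
    id INT UNSIGNED NOT NULL AUTO_INCREMENT,
    product_code VARCHAR(64) NOT NULL,   -- placeholder column
    description  VARCHAR(255),           -- placeholder column
    PRIMARY KEY (id),
    UNIQUE KEY uq_product (product_code)
);

-- Copy only the distinct rows from the old table.
INSERT INTO techdata_products_new (product_code, description)
SELECT DISTINCT product_code, description
FROM techdata_products;
```

With the UNIQUE KEY in place, a re-run of the import can use INSERT IGNORE so repeated rows are skipped instead of duplicated.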

3 Comments

Well mate, it has 15 lakh (1.5 million) records, and a MySQL script would not run smoothly on it.
Can you provide an example of what your dupe records look like? That would help give a better understanding of how to write the script.
Also, have you tried writing any delete queries that you think could work, or tested any in a dev environment that might just need tweaking?
