The error looks like this:

ERROR: could not create unique index "tb_foo_pkey"
DETAIL: Key (id_)=(3) is duplicated.

This is a "logical corruption": the table already contains duplicate key values, so PostgreSQL cannot build a unique index over it. The same failure shows up in many guises:

ERROR: could not create unique index "pg_statistic_relid_att_inh_index"
DETAIL: Key (starelid, staattnum, stainherit)=(2610, 15, f) is duplicated.

ERROR: could not create unique index "tbl_os_mmap_topoarea_pkey"
DETAIL: Key (toid)=(1000000004081308) is duplicated.

ERROR: could not create unique index "redirect_rd_from"
DETAIL: Key (rd_from)=(110) is duplicated.

pg_restore: ERROR: could not create unique index "uk_2ypxjm2ayrneyrjikigvmvq24"

When the duplicated key is in a system catalog such as pg_statistic (whose statistics are used by the query planner), something deeper is usually wrong. From the mailing-list exchange:

> "Paul B. Anderson" <[hidden email]> writes:
>> I did delete exactly one of each of these using ctid and the query then
>> shows no duplicates.
>> I also tried reindexing the table.
>
> That's pretty odd --- I'm inclined to suspect index corruption.

And even after such a cleanup, the problem comes right back in the next database-wide vacuum, logged as:

Apr 26 14:50:44 stationname postgres[5452]: [10-2] 2017-04-26 14:50:44 PHT postgres DBNAME 127.0.0.1 DETAIL: Key (starelid, staattnum, stainherit)=(2610, 15, f) is duplicated.

The error also appears at the application level:

- A bug allowed the Connect to insert duplicate rows into a particular table.
- An issue table had two or more records with the same repo_id and index, caused by exactly the very old version being run; the fix starts with upgrading to the latest master.
- A Django migration that added unique=True and default=None to a field that already had blank=True and null=True. At first I did not think I had put any data into the entity yet, but I had. (Thank you indeed, Mai.)
- A quiz upgrade test: create some non-preview attempts with the same values of (quiz, userid) and overlapping attempt numbers, then run the upgrade.
- A row that was exactly duplicated: every field is the same in the two rows.

The general cleanup recipe: using a CTE and window functions, find out which of the repeated values will be kept, delete the rest, and only then create the unique index. (I will never forget to create the unique index before testing again.) With Heroku Postgres, handling such duplicates is simple.
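The "CTE and window functions" step can be sketched as follows. This is a minimal example, assuming a hypothetical issue table with columns id, repo_id, and index (quoted below because index is a keyword); adapt the column list to your own duplicated key:

```sql
-- Rank the rows inside each duplicated (repo_id, "index") group.
-- rn = 1 marks the row a "keep the smallest id" cleanup would retain;
-- rn > 1 marks the rows that would be deleted.
WITH ranked AS (
    SELECT id,
           repo_id,
           "index",
           ROW_NUMBER() OVER (PARTITION BY repo_id, "index"
                              ORDER BY id) AS rn
    FROM issue
)
SELECT * FROM ranked WHERE rn > 1;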
The Django case resolved simply. The failing migration had been reporting:

psycopg2.errors.UniqueViolation: could not create unique index "users_user_email_243f6e77_uniq"
DETAIL: Key (email)=([email protected]) is duplicated.

Therefore, as Carl suggested, I deleted the entity and re-created it, and then it actually worked. @IijimaYun, you're right; I remembered I had to do the same procedure about a month ago. When I first migrated, one problem I had was related to how string columns work.

For the issue table, the only way to fix it is to delete the duplicated records manually, keeping only the one with the smallest ID, after first writing a query to find the duplicates. The trick is to force that query to scan the table rather than just the index (which does not have the duplicates). The situation is rather innocuous in itself as far as the Connect is concerned, and should be easy to fix.

For the quiz upgrade test, verify that (a) there are no errors during the upgrade, and (b) at the end of the upgrade there are no rows with preview = 1 in the quiz_attempts table.

Somehow I had ended up with an exactly duplicated row, and REINDEX fails on it too:

REINDEX INDEX rank_details_pkey;
ERROR: could not create unique index "rank_details_pkey"
DETAIL: Table contains duplicated values.

The redirect table shouldn't be this messy, and should have the unique index nevertheless.
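The "keep the one with the smallest ID" cleanup can be written directly as a DELETE. A sketch against the same hypothetical issue(id, repo_id, "index") table; the two SET lines implement the "force a table scan" idea from above, so the duplicates are read from the heap rather than from an index that wrongly claims the values are unique:

```sql
-- Make the planner read the table itself, not the (possibly corrupt) index.
SET enable_indexscan = off;
SET enable_bitmapscan = off;

-- Delete every duplicate except the row with the smallest id
-- in each (repo_id, "index") group.
DELETE FROM issue a
USING issue b
WHERE a.repo_id = b.repo_id
  AND a."index" = b."index"
  AND a.id > b.id;
```

After the duplicates are gone, creating the unique index (or re-running REINDEX) should succeed.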
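When the two rows are identical in every column, there is no id to discriminate on, and the hidden system column ctid (the row's physical location) is the only handle left, which is how the duplicates were deleted in the pg_statistic case above. A sketch, assuming a hypothetical rank_details table whose id column is the duplicated key:

```sql
-- Keep the physically first copy of each fully identical row
-- and delete the rest, addressing them by ctid.
WITH dup AS (
    SELECT ctid,
           ROW_NUMBER() OVER (PARTITION BY id ORDER BY ctid) AS rn
    FROM rank_details
)
DELETE FROM rank_details
WHERE ctid IN (SELECT ctid FROM dup WHERE rn > 1);
```

Note that ctid values are only stable within a transaction (VACUUM can move rows), so run the cleanup and the subsequent CREATE UNIQUE INDEX or REINDEX promptly.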