Hi, can I ask for some help? My problem is with a column in my table, let's call it let_id. In the original design let_id was an INTEGER, but after a few months I noticed that a lot of data was no longer being inserted, because the let_id values kept growing and the INT data type could no longer hold them. So I changed the column to BIGINT and the data is being inserted again. My worry is: what if, some day, even BIGINT can't hold the values anymore? What would my solution be then?
Note: the data comes from a third-party web app; I just grab it and insert it into my table.
Thank you in advance.
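In case it helps, this is roughly the kind of column change involved (a minimal sketch, assuming MySQL and a hypothetical table name my_table; UNSIGNED is optional and only makes sense if let_id can never be negative):

```sql
-- Widen the column from INT to BIGINT.
-- UNSIGNED doubles the positive range, but only use it if let_id is never negative.
ALTER TABLE my_table
  MODIFY COLUMN let_id BIGINT UNSIGNED NOT NULL;
```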
INTEGER can handle values up to 2,147,483,647 (4,294,967,295 if UNSIGNED)
so you’re saying you have that many rows? or is it a smaller table but the third party web app is just using unreasonably large numbers for let_id?
BIGINT can handle values up to 9,223,372,036,854,775,807 (twice that if UNSIGNED)
there is no larger integer datatype than BIGINT
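For concreteness, here are the documented maximums, and what the failure typically looks like when a value goes past the column's range (MySQL with strict SQL mode assumed):

```sql
-- The integer limits in question:
SELECT 2147483647           AS int_max_signed,
       4294967295           AS int_max_unsigned,
       9223372036854775807  AS bigint_max_signed,
       18446744073709551615 AS bigint_max_unsigned;

-- In strict SQL mode, inserting a value beyond the column's range fails with
-- something like: ERROR 1264 (22003): Out of range value for column 'let_id' at row 1
-- which matches the "data not inserted anymore" symptom described above.
```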
Yes, I have just 1,200 rows, but the let_id keeps getting bigger.
Is it possible you’re using the id as a key when it is intended to be only an id?
i don’t mean this to sound sarcastic, but could you rephrase that in english? are you talking about PKs? auto_increments? indexes?
I was wondering if the source data was something like
id = some auto-increment number
looks_like_a_table_id = not an auto-increment number
then on the client
id = auto-increment column
INSERT INTO your_table SET id = looks_like_a_table_id
might be eating up the range. It’s the only thing I could think of that might explain the choke happening at 1200 rows.
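A hypothetical illustration of that scenario (MySQL assumed; the table and values are made up): inserting an explicit value into an AUTO_INCREMENT column pushes the counter past that value, so the range can be exhausted long before the row count gets large.

```sql
CREATE TABLE demo (
  id   INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
  note VARCHAR(30)
);

INSERT INTO demo (note) VALUES ('normal row');        -- id = 1
INSERT INTO demo (id, note) VALUES (2000000000, 'copied source id');
INSERT INTO demo (note) VALUES ('next normal row');   -- id = 2000000001, already near INT's signed limit
```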