MySQL Merge tables - high-traffic and large amounts of data
My work currently uses MySQL (MyISAM) exclusively for all data storage; we have 300+ webservers and around 150 databases. Unfortunately I'm in the position of having to write a table structure to support more than 100 million rows over a 30-day period. The idea is:
High-volume inserts (no updates or deletes; rows are always appended to the end of the table)
Single-row selects
Data older than 30 days is thrown away
The best solution seems to be a MERGE table with one underlying table for each day. There will actually be duplicate data, but the SELECT will only pull the most recent row, based on a timestamp and an int field. Obviously 30 tables isn't ideal, but so goes life.
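To make the duplicate-data point concrete, the single-row select I have in mind would look roughly like this (table name and literal values are made up; the columns follow the structure at the bottom of the post):

```sql
-- Pull only the most recent row for a given hash/p_id pair,
-- even though older duplicates exist in previous days' tables.
SELECT *
FROM merge_test
WHERE `hash` = 'abcdef0123456789abcdef0123456789'
  AND `p_id` = 12345
ORDER BY `date_stamp` DESC
LIMIT 1;
```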
Is there any inherent flaw with this approach? Are there other ways to approach this that I'm missing (we're stuck on 5.0)? Will table locking be a big issue when the new day's table is created and the merge table is ALTERed? We currently have a table-rotation structure, but if we went with a single table, the selects for the data we want from the old rows would become very slow, since the table would be approaching 100 million rows.
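For reference, the daily rotation I'm picturing is roughly the following (table names are hypothetical). My understanding is that ALTER TABLE ... UNION on a MERGE table only rewrites the merge definition rather than copying row data, so it should be quick, but that assumption is part of what I'm asking about:

```sql
-- Hypothetical nightly rotation on MySQL 5.0 (MyISAM/MERGE):

-- 1. Create tomorrow's empty table with the same structure.
CREATE TABLE merge_test_day31 LIKE merge_test_day30;

-- 2. Repoint the merge table at the newest 30 underlying tables.
ALTER TABLE merge_test
  UNION = (merge_test_day02, /* ..., */ merge_test_day31),
  INSERT_METHOD = LAST;

-- 3. Throw away the data that is now older than 30 days.
DROP TABLE merge_test_day01;
```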
There are other technologies that could accomplish this well, but our sales team has already sold the solution, and we don't have the luxury of time.
Any input will be appreciated.
Structure:
CREATE TABLE `merge_test_1` (
  `date_stamp` int NOT NULL,
  `hash` char(32) NOT NULL,
  `p_id` mediumint(8) unsigned NOT NULL,
  `a_id` mediumint(8) unsigned NOT NULL,
  `b_id` mediumint(8) unsigned NULL,
  PRIMARY KEY (`hash`, `p_id`, `date_stamp`)
) ENGINE=MyISAM
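And for completeness, a sketch of how I'd combine the per-day copies of that structure under one MERGE table (names are hypothetical; each underlying table must be structurally identical):

```sql
-- Per-day tables cloned from the structure above.
CREATE TABLE merge_test_day01 LIKE merge_test_1;
CREATE TABLE merge_test_day02 LIKE merge_test_1;

-- The MERGE table itself. Note the key is a plain KEY, not a
-- PRIMARY KEY: a MERGE table cannot enforce uniqueness across
-- its underlying tables, which is why duplicates will exist.
CREATE TABLE merge_test (
  `date_stamp` int NOT NULL,
  `hash` char(32) NOT NULL,
  `p_id` mediumint(8) unsigned NOT NULL,
  `a_id` mediumint(8) unsigned NOT NULL,
  `b_id` mediumint(8) unsigned NULL,
  KEY (`hash`, `p_id`, `date_stamp`)
) ENGINE=MERGE
  UNION = (merge_test_day01, merge_test_day02)
  INSERT_METHOD = LAST;
```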