I'm using MySQL. I would like to know: is there a limit on the number of rows in a table?

I used to develop applications using Sybase, and I do not have experience developing applications that involve very, very large tables. I have only worked with tables of a few million rows.

I'm curious how many rows per table MySQL can support. Can MySQL handle a table with over 100 million rows, say? Even if MySQL does support huge tables, performance would inevitably drop. I know that using multiple database servers is a must at that scale, but does Drupal have any standard strategies for dealing with huge tables? Will a table with a huge number of records be broken down into several smaller tables automatically?

I ask this because I just inspected the database and read the book Pro Drupal Development. I found that almost everything is a node: almost every piece of content creates an entry in the node table, so the table will inevitably accumulate a huge number of records. How will MySQL deal with a table of, say, 100 million rows? Such a table would be over 10 GB in size.
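
To make the "broken down into several smaller tables" idea concrete, here is roughly what I have in mind, sketched with MySQL 5.1 range partitioning. The table and column names only mimic Drupal's node table, and as far as I know Drupal does not set up anything like this automatically; it would be a manual, DBA-level step.

-- A sketch only: manually range-partitioning a node-like table by year.
CREATE TABLE node_by_year (
  nid INT UNSIGNED NOT NULL,
  type VARCHAR(32) NOT NULL,
  title VARCHAR(255) NOT NULL,
  created INT NOT NULL,            -- Unix timestamp, as Drupal stores it
  PRIMARY KEY (nid, created)       -- the partitioning column must be part of the key
)
PARTITION BY RANGE (created) (
  PARTITION p2007 VALUES LESS THAN (UNIX_TIMESTAMP('2008-01-01 00:00:00')),
  PARTITION p2008 VALUES LESS THAN (UNIX_TIMESTAMP('2009-01-01 00:00:00')),
  PARTITION pmax  VALUES LESS THAN MAXVALUE   -- everything newer
);

A query that filters on created would then only touch the relevant partition instead of the whole 100-million-row table.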

I don't have much experience with very large databases. As Drupal is developed by experts in several fields around the world, I would like to raise this question here.

Thanks a lot.

Comments

nevets

I have to question your benchmark of 100 million rows. Drupal.org has roughly a quarter of a million rows in its node table (about 1/400th of your benchmark), it has been around for a while, and it is active. That said, Drupal.org has some special needs, though my understanding is that those have more to do with traffic.

sisyphus

Well, the benchmark is really just a guesstimate.

I'm setting up a site for my students using Drupal. I'm just guessing that if they keep using the site for blogging and discussion even after they leave school and become alumni, then over, say, 15 years it is not impossible for the node table to accumulate a few million records.

Also, social networking sites like Facebook, Bebo, and MySpace are popular, and I often hear that these sites have over 10 million users. I'm just wondering how they handle that huge number of records (just curious), and I would like to know how Drupal can handle such an amount of records while maintaining performance. I read the Pro Drupal Development book; its later chapters talk about performance and scalability, but they don't cover huge database / table issues.

Thanks.

testjimmy

I am curious about this too. I know MODx uses a similar approach, storing all site content in one table, and if you have a lot of content many functions (like creating a list of all your articles) become extremely slow. How does Drupal overcome this in terms of performance?
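
I assume part of the answer is indexes plus paging, so a listing query only reads one page of rows instead of the whole table. A rough sketch (the index name is made up, and the columns just mirror Drupal's node table):

-- Hypothetical index supporting "list content of one type, newest first".
CREATE INDEX node_type_created ON node (type, created);

-- A pager-style query: MySQL can walk the index in order and stop after 20 rows.
SELECT nid, title
FROM node
WHERE type = 'story'
ORDER BY created DESC
LIMIT 0, 20;

Running EXPLAIN on the SELECT should confirm whether the index is actually being used.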