I have user records with a moderate amount of information in each record
(~200 bytes).
Some columns change very often, others very seldom, and some are read
very often whereas others are seldom needed.
Does it improve performance to use several tables with a common key
(i.e. account_id), so that when querying for a list of accounts with a
given property the DBMS only has to sift through shorter records? Or
does the cost of having to join two or more of these tables always
overwhelm the gains from searching a single table?
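For concreteness, here is the kind of split I have in mind (the table
and column names are invented just for illustration):

```sql
-- "Cold" half: read often, changes seldom.
CREATE TABLE account_core (
    account_id  bigint PRIMARY KEY,
    email       text NOT NULL,
    created_at  timestamptz NOT NULL
);

-- "Hot" half: same key, columns that change very often.
CREATE TABLE account_activity (
    account_id  bigint PRIMARY KEY REFERENCES account_core,
    last_login  timestamptz,
    login_count integer
);

-- Narrow scan when only the hot columns are needed:
SELECT account_id
FROM account_activity
WHERE last_login > now() - interval '1 day';

-- Join when columns from both halves are needed:
SELECT c.email, a.last_login
FROM account_core c
JOIN account_activity a USING (account_id);
```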
I understand the answer is probably 'it depends on what you use the
DBMS for', but if there are general rules of thumb, or Postgres-specific
considerations, I'd love to know.
Thanks.
--Maurice