From: Frank Bax <fbax(at)sympatico(dot)ca>
To: pgsql-sql(at)postgresql(dot)org
Subject: Re: SEVEN cross joins?!?!?
Date: 2005-10-19 10:42:33
Message-ID: 5.2.1.1.0.20051019054006.0455bec0@pop6.sympatico.ca
Lists: pgsql-sql
At 09:04 AM 10/13/05, Daryl Richter wrote:
>Frank Bax wrote:
>
>[snip]
>
>>Richard, you've summed it up nicely.
>>Splitting locations into subsets (like 2,2,3) doesn't work because it is
>>possible that low values in one location can be offset by high values in
>>another location, and still result in an excellent combo.
>>The good news is these suggestions got me thinking outside the box. I
>>think I can program a modified brute-force that bypasses large numbers of
>>combos early. It might still be too large/slow, so I'd be interested in
>>finding more info about these "smarter algorithms" in option 2.
>>Where do I look?
>
>If you're mathematically inclined, I would first look at using
>Lagrangian Relaxation; it may be appropriate for your problem:
>
>http://www.2112fx.com/lagrange.html
Thanks, but that's a little too complex for me to turn into code!
I did rewrite my code, from a simple cross-join SQL query in PHP to a
custom search in Perl. I pulled the subselects into arrays and then
walked all possible combinations.
For those who forgot/missed the background: my table has 514 rows. Using
subselects, this table is split into seven subtables. These seven subtables
are cross joined with each other to produce 770 billion rows that need to be
searched (to assemble a 'made-to-order' suit of armour).
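Roughly what the Perl now does, in sketch form: seven arrays, a nested
search, and a bound that skips whole subtrees early. The row layout,
scoring, and random demo data below are placeholders, not my real schema:

#!/usr/bin/perl
use strict;
use warnings;

# Seven per-location arrays standing in for the seven subselects.
# Row format here is hypothetical: [ id, value ], with random filler.
my @locations = map {
    my $loc = $_;
    [ map { [ "item$loc.$_", 1 + int rand 10 ] } 1 .. 5 ];
} 1 .. 7;

# For each location, the best value still reachable from that location
# onward; this is what lets us bypass large numbers of combos early.
my @best_rest = (0) x (@locations + 1);
for my $i (reverse 0 .. $#locations) {
    my ($max) = sort { $b <=> $a } map { $_->[1] } @{ $locations[$i] };
    $best_rest[$i] = $best_rest[$i + 1] + $max;
}

my $best_value = 0;
my @best_combo;

sub search {
    my ($i, $value, @combo) = @_;
    if ($i > $#locations) {
        ($best_value, @best_combo) = ($value, @combo) if $value > $best_value;
        return;
    }
    # Bound: even a perfect remainder cannot beat the current champion,
    # so the whole subtree under this partial combo is skipped.
    return if $value + $best_rest[$i] <= $best_value;
    search($i + 1, $value + $_->[1], @combo, $_->[0]) for @{ $locations[$i] };
}

search(0, 0);
print "best=$best_value combo=@best_combo\n";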
By using more intelligent code (and not simple brute force), I was able to
analyse a complete set of 770 billion states in just under 70 hours on a
P4 2.8GHz system, which is fast enough for today. A faster CPU will help,
since the process does no I/O except at the beginning and end of the script.
I realised that if I can ever figure out coding for multi-processor or
distributed systems (even remote ones, like SETI(at)home), I can exploit
either or both for this problem by splitting it on the items in the first
subtable into 50-60 subtasks, then merging the results from those subtasks.
This might become a requirement if the underlying table grows much larger.
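In sketch form, reusing @locations and search() from above (the slice size
and serial loop are stand-ins for whatever the real per-CPU dispatch would
be; 50 is just the subtask count mentioned above):

# Split on the items in the first subtable: each slice is an independent
# subtask, and only the per-slice winners need to be merged at the end.
# The precomputed bound stays valid: it only over-estimates for a slice.
my $tasks = 50;
my $whole = $locations[0];
my $per   = int(@$whole / $tasks) + 1;

my @winners;
for my $t (0 .. $tasks - 1) {
    my @slice = grep defined, @{$whole}[ $t * $per .. ($t + 1) * $per - 1 ];
    next unless @slice;
    # Each slice could run on its own CPU or remote host; serial here,
    # just to show the split and the merge.
    $locations[0] = \@slice;
    ($best_value, @best_combo) = (0);
    search(0, 0);
    push @winners, [ $best_value, [@best_combo] ];
}
$locations[0] = $whole;

my ($overall) = sort { $b->[0] <=> $a->[0] } @winners;
print "overall best=$overall->[0] combo=@{ $overall->[1] }\n";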
Thanks for pointing me in the right direction; it's been an interesting week.
Frank