2003-11-22, 01:02   #1
GP2
Sep 2003
2585₁₀ Posts

A distributed-computing project to optimize GIMPS FFT? Genetic algorithms

Here's a wacky idea that may or may not be implementable.

First, some lengthy background:

From New Scientist, 15 November 1997:
http://www.newscientist.com/hottopics/ai/primordial.jsp
It's about using genetic algorithms to "evolve" the performance of hardware. Fascinating, a must-read.

The above article was the basis for:
The Distributed Hardware Evolution Project,
a recently-begun distributed computing project.

See also their official forum hosted by Free-DC.

Also, consider this article that ran on Slashdot a few days ago:

Using genetic algorithms to optimize the optimization flags given to a C compiler:
http://developers.slashdot.org/devel...id=126&tid=156

Inside the Slashdot story is a link to the original article:
http://www.coyotegulch.com/acovea/index.html


Finally, putting it all together:
would it be possible to use genetic algorithms to optimize the FFT assembly code for each chip type and each FFT size? This could be incorporated right into the Prime95 client, which would report its results to the new version of the server.

All 40,000+ Prime95 boxes out there could spend 15 minutes a day running new generations of the genetic algorithm... each generation would be extremely quick, just a simple benchmark.
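
To make that concrete, here's a minimal sketch of what one "generation" on a single client might look like, assuming the tunable knobs (unrolling depth, prefetch distance, blocking factor, and so on) can be encoded as a small vector of integers and the fitness is simply the time of a short benchmark. Everything here is hypothetical; benchmark_fft() is a toy stand-in, not anything from Prime95:

[code]
#include <algorithm>
#include <cstdlib>
#include <ctime>
#include <iostream>
#include <vector>

// Hypothetical tunable knobs for one FFT size on one CPU type.
struct Params {
    int unroll;    // loop unrolling depth
    int prefetch;  // prefetch distance in cache lines
    int block;     // cache-blocking factor
};

// Stand-in fitness: lower is better.  A real client would time a short run
// of the FFT configured with these parameters and return the elapsed time.
double benchmark_fft(const Params& p) {
    // toy cost surface so the sketch runs on its own
    return std::abs(p.unroll - 4) + 0.5 * std::abs(p.prefetch - 8)
         + 0.1 * std::abs(p.block - 64);
}

bool better(const Params& a, const Params& b) {
    return benchmark_fft(a) < benchmark_fft(b);
}

// Randomly tweak one knob.
Params mutate(Params p) {
    switch (std::rand() % 3) {
        case 0:  p.unroll   = 1 + std::rand() % 8;        break;
        case 1:  p.prefetch = 1 + std::rand() % 16;       break;
        default: p.block    = 16 * (1 + std::rand() % 8); break;
    }
    return p;
}

int main() {
    std::srand(static_cast<unsigned>(std::time(0)));

    // random starting population
    std::vector<Params> pop(20);
    for (size_t i = 0; i < pop.size(); ++i) {
        Params p = { 4, 4, 32 };
        pop[i] = mutate(mutate(mutate(p)));
    }

    // each generation: keep the faster half, refill with mutated copies
    for (int gen = 0; gen < 50; ++gen) {
        std::sort(pop.begin(), pop.end(), better);
        for (size_t i = pop.size() / 2; i < pop.size(); ++i)
            pop[i] = mutate(pop[i - pop.size() / 2]);
    }

    std::sort(pop.begin(), pop.end(), better);
    std::cout << "best found: unroll=" << pop[0].unroll
              << " prefetch=" << pop[0].prefetch
              << " block="    << pop[0].block << std::endl;
    return 0;
}
[/code]

Each client would only need to report its best parameter set and the measured time back to the server, which could merge results from machines of the same CPU type.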

If I'm not mistaken, some of the major optimizations that George found were discovered by accident: he noticed that the debug version of the code ran faster than the non-debug version, and investigated why.

The truly fascinating thing about genetic algorithms is that they're capable of "evolving" optimizations that a human designer sometimes can't even comprehend when examining the end result, let alone be capable of designing.

A self-optimizing distributed computing project... it has a certain logic to it, no?

OK, I've done the hard part of thinking this up... now it's just a matter of the trivial details of implementation.
2003-11-22, 23:02   #2
jeff8765
Aug 2002
A Dyson Sphere
3²×7 Posts

I think that this is a great idea if it is feasible. I remember reading an article in Scientific American about how various things have been improved with genetic algorithms like this.
2003-11-23, 05:53   #3
nucleon
Mar 2003
Melbourne
5×103 Posts

I'd be interested to see if FFT algorithms could evolve. But can we guarantee the result?

The evolved code could end up giving the correct result _only_ on the test input. Once the algorithm has stabilized through the evolving process, I wonder whether _different_ inputs would still give correct answers.
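
The obvious safeguard would be to never accept an evolved routine until it has been checked against a slow, trusted reference on lots of inputs it never saw during evolution. Something roughly like this; candidate_fft() is just a placeholder for the evolved code (here it calls the reference, so the check trivially passes):

[code]
#include <algorithm>
#include <complex>
#include <cstdlib>
#include <iostream>
#include <vector>

typedef std::complex<double> cplx;
const double PI = 3.14159265358979323846;

// Trusted reference: a direct O(N^2) DFT.
std::vector<cplx> reference_dft(const std::vector<cplx>& x) {
    size_t n = x.size();
    std::vector<cplx> out(n);
    for (size_t k = 0; k < n; ++k)
        for (size_t j = 0; j < n; ++j)
            out[k] += x[j] * std::polar(1.0, -2.0 * PI * double(k * j) / double(n));
    return out;
}

// Placeholder for the evolved routine under test.
std::vector<cplx> candidate_fft(const std::vector<cplx>& x) {
    return reference_dft(x);   // a real harness would call the evolved code here
}

int main() {
    std::srand(12345);                       // fixed seed: reproducible test set
    const size_t n = 64;
    double worst = 0.0;

    for (int trial = 0; trial < 100; ++trial) {
        std::vector<cplx> x(n);
        for (size_t i = 0; i < n; ++i)       // random inputs the GA never trained on
            x[i] = cplx(std::rand() / double(RAND_MAX) - 0.5,
                        std::rand() / double(RAND_MAX) - 0.5);

        std::vector<cplx> want = reference_dft(x);
        std::vector<cplx> got  = candidate_fft(x);
        for (size_t i = 0; i < n; ++i)
            worst = std::max(worst, std::abs(want[i] - got[i]));
    }

    std::cout << "worst deviation over random inputs: " << worst << std::endl;
    return worst < 1e-9 ? 0 : 1;             // nonzero exit = reject the candidate
}
[/code]

Of course this only raises confidence; it can't prove correctness for every possible input, which is exactly the residual worry.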

But as has been said in the past, the current code is already highly optimised. Extending what was said in the New Scientist article, the evolving process might exploit 'non-digital' behaviour of the processor, which may depend on environmental conditions.

For example, if the evolving process were done entirely in the northern hemisphere, there could be situations where running the result on identical hardware in the southern hemisphere produces different results.

Given what I've said above, it's still a very interesting experiment.

-- Craig
2003-11-23, 06:37   #4
GP2
Sep 2003
5×11×47 Posts

You wouldn't get the kind of weird analog interference effects and temperature dependence that they got in the original New Scientist article. I mean, the end result here is simply an assembly language routine, not an electrical circuit.

It would be closer to the article about varying optimization flags for compiling any given C program. Only here we're dealing with assembler. There are various parameters (each with tradeoffs) that George adjusts when he tries to optimize, and then he asks people to try benchmarking the modified version. But in all cases you're dealing with a program that gives the correct result -- it's just that some versions are more efficient than others on a given CPU.

With genetic algorithms, you could find an optimum combination of parameters much more readily than by brute-force testing of every possible combination. And it could find that optimum even if it seems counterintuitive.
2003-11-23, 16:23   #5
Dresdenboy
Apr 2003
Berlin, Germany
101101001₂ Posts

The task is really not easy. Current developments in this area include the self-optimizing FFT libraries (e.g. FFTW).

These compiler-flag optimizations could be applied to existing FFT sources, but that doesn't work for FFTs which are written in assembler.

If you look into a book about FFT algorithms and the tricks applied in them, you'll quickly get the impression that this is a wonderful target for genetic algorithms. Currently that would mean applying the best known tricks to a certain target architecture. But the creation of yet unknown tricks and tweaks would be much more difficult. Put differently: one would have to write a symbolic processor that takes some math expressions and searches for the optimal implementation, which could then be "written" in C++ and compiled in a second step that uses genetic algorithms to find the optimal compiler flags.
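
The second step, at least, is easy to picture. A crude sketch of that acovea-style flag search, with the fitness stubbed out (a real driver would compile a test program with the chosen flags, run it under a timer, and use the measured time as the score; the file name fft_test.c below is made up):

[code]
#include <bitset>
#include <cstdlib>
#include <ctime>
#include <iostream>
#include <string>
#include <vector>

// A small menu of real gcc flags (circa gcc 3.x) to switch on or off.
const char* FLAGS[] = {
    "-O3", "-funroll-loops", "-fomit-frame-pointer",
    "-ffast-math", "-march=pentium4", "-malign-double"
};
const int NFLAGS = sizeof(FLAGS) / sizeof(FLAGS[0]);

// Build the compile command for a given genome (one bit per flag).
std::string command(const std::bitset<8>& genome) {
    std::string cmd = "gcc fft_test.c -o fft_test";   // fft_test.c is hypothetical
    for (int i = 0; i < NFLAGS; ++i)
        if (genome[i]) { cmd += " "; cmd += FLAGS[i]; }
    return cmd;
}

// Stub fitness, lower is better.  A real version would run the compiled
// binary under a timer; this toy score just pretends more flags = faster.
double fitness(const std::bitset<8>& genome) {
    return 10.0 - genome.count();
}

int main() {
    std::srand(static_cast<unsigned>(std::time(0)));
    std::vector<std::bitset<8> > pop(16);
    for (size_t i = 0; i < pop.size(); ++i)
        pop[i] = std::rand() % (1 << NFLAGS);

    // steady-state GA: pick two, let the loser copy the winner, then mutate it
    for (int gen = 0; gen < 200; ++gen) {
        size_t a = std::rand() % pop.size(), b = std::rand() % pop.size();
        if (fitness(pop[a]) > fitness(pop[b])) std::swap(a, b);
        pop[b] = pop[a];
        pop[b].flip(std::rand() % NFLAGS);
    }

    size_t best = 0;
    for (size_t i = 1; i < pop.size(); ++i)
        if (fitness(pop[i]) < fitness(pop[best])) best = i;
    std::cout << command(pop[best]) << std::endl;
    return 0;
}
[/code]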

I see this situation a bit like a human playing a game of chess against a computer program. The program can try a huge number of combinations much faster than the human player, but the latter will use his experience, intuition and strategic/tactical skills.

The text p4notes.doc gives an impression of what the genetic approach would have to try.

2003-11-24, 18:43   #6
jflin
Aug 2003
5 Posts

If this is to be done, it must be done as two separate projects:

1. A GA to find the best FFT algorithm.
2. A GA to find the "stumpers", the inputs on which the result of 1 fails.

You can find similar references to this type of project in the search for a faster sort algorithm, one of the earlier GA applications. The purpose of the second GA is to make sure that the first one is robust enough to handle all the cases.
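
For what it's worth, a bare-bones skeleton of that two-population arrangement might look like this. The Candidate and Stumper representations and the passes() test are stand-ins; in a real setup a candidate would be an evolved FFT configuration and a stumper an input vector checked against a trusted reference:

[code]
#include <cstdlib>
#include <ctime>
#include <iostream>
#include <vector>

struct Candidate { int genome; };   // stand-in for an evolved routine
struct Stumper   { int seed;   };   // stand-in for a hostile test input

// Stub: does this candidate give the right answer on this input?
// (Toy rule, purely so the skeleton runs; a real check would compare
// the candidate's output against a trusted reference.)
bool passes(const Candidate& c, const Stumper& s) {
    return (c.genome ^ s.seed) % 7 != 0;
}

int main() {
    std::srand(static_cast<unsigned>(std::time(0)));
    std::vector<Candidate> cands(10);
    std::vector<Stumper>   tests(10);
    for (size_t i = 0; i < cands.size(); ++i) {
        cands[i].genome = std::rand();
        tests[i].seed   = std::rand();
    }

    for (int gen = 0; gen < 100; ++gen) {
        // Candidate fitness: how many current stumpers it survives.
        for (size_t i = 0; i < cands.size(); ++i) {
            int ok = 0;
            for (size_t j = 0; j < tests.size(); ++j)
                ok += passes(cands[i], tests[j]) ? 1 : 0;
            if (ok < (int)tests.size())          // failed at least one stumper:
                cands[i].genome = std::rand();   // replace (a real GA would mutate/breed)
        }
        // Stumper fitness: how many current candidates it breaks.
        for (size_t j = 0; j < tests.size(); ++j) {
            int broke = 0;
            for (size_t i = 0; i < cands.size(); ++i)
                broke += passes(cands[i], tests[j]) ? 0 : 1;
            if (broke == 0)                      // broke nobody: try a new stumper
                tests[j].seed = std::rand();
        }
    }
    std::cout << "remaining candidates have survived the evolving test suite"
              << std::endl;
    return 0;
}
[/code]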

The other option would be to come up with a complete chart of all the possible multiplications to validate against, but I think that would be too large to be practical.

You may want this as a separate effort, outside of Prime95. It could evolve using some type of "test" harness, and if successful, the result could be sent to George to include in the P95 code. (And maybe get some of the better-method money, if you think that way!)
2003-12-03, 17:34   #7
ewmayer
2ω=0
Sep 2002
República de California
2DD0₁₆ Posts

Before we can start realistically considering the use of genetic algorithms to construct a non-trivial algorithm like the FFT, let's see if they can do something in a similar vein, but much, much simpler. Example: say I have N complex inputs in x + I*y form, which I want to multiply together in various ways. Assume the number of complex muls is >> N. Can your genetic algorithm figure out that the best way (or at least a better way) to do this is to first convert the multiplicands to complex polar form, then multiply?
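
Just to spell the toy problem out in code: in rectangular form each complex multiply costs four real multiplies and two adds, while after a one-time conversion to polar form each product is one real multiply (of the magnitudes) and one add (of the angles):

[code]
#include <complex>
#include <iostream>
#include <vector>

int main() {
    // A few complex inputs in rectangular (x + i*y) form.
    std::vector<std::complex<double> > x;
    x.push_back(std::complex<double>( 1.0,  2.0));
    x.push_back(std::complex<double>(-0.5,  0.3));
    x.push_back(std::complex<double>( 2.0, -1.0));

    // Rectangular: a chain of full complex multiplies
    // (4 real multiplies + 2 adds each).
    std::complex<double> prod_rect(1.0, 0.0);
    for (size_t i = 0; i < x.size(); ++i)
        prod_rect *= x[i];

    // Polar: convert once, then each product is one real multiply
    // (magnitudes) and one real add (angles).
    double r = 1.0, theta = 0.0;
    for (size_t i = 0; i < x.size(); ++i) {
        r     *= std::abs(x[i]);
        theta += std::arg(x[i]);
    }
    std::complex<double> prod_polar = std::polar(r, theta);

    std::cout << prod_rect << " vs " << prod_polar << std::endl;   // same value
    return 0;
}
[/code]

Whether a GA operating on low-level code transformations would ever discover that change of representation on its own is exactly the question.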

GAs are probably quite useful for things like optimizing compile sequences and memory access patterns, or pipelining a given set of machine instructions, but I have my doubts as to whether they're much use for developing fundamental algorithms of the nontrivial variety. Of course the human algorithm designer is also a product of an evolutionary algorithm (with a healthy bit of random chance thrown in, quoth the dinosaur), but I don't feel like waiting a couple of million years for a GA to tell me how to reduce the opcount of my radix-N DFT algorithm by 10 or 20% by way of a previously unheard-of transform. Let's not make the same mistake as Steve Wolfram, of confusing pretty pictures with fundamental insight. No disrespect to Mr. Wolfram, but I have yet to hear of a single basic physical law being elucidated by way of cellular automata. (But that's probably a topic for a different forum.)
2003-12-03, 17:49   #8
wblipp
"William"
May 2003
New Haven
2,371 Posts

Quote:
Originally posted by ewmayer
GAs are probably quite useful for things like optimizing compile sequences and memory access patterns, or pipelining a given set of machine instructions, but I have my doubts as to whether they're much use for developing fundamental algorithms of the nontrivial variety.
Perhaps I misunderstood, but I thought the proposal was to search among things like pipelining and memory access. I imagined that the search space would be variations constructed to produce the same results, but with many possible orderings of the intermediate steps.
2003-12-03, 18:09   #9
GP2
Sep 2003
5·11·47 Posts

wblipp is right. I had in mind using a GA to optimize the tunable parameters of a known algorithm, not something that could find an entirely new algorithm. As in the cited article on finding the best C compiler optimization flags for a given program... here we'd be looking for the best setting of tunable parameters for a given CPU architecture.
2003-12-04, 08:20   #10
Dresdenboy
Apr 2003
Berlin, Germany
19² Posts

That would be the most easily implementable solution. There are a lot of important macros whose speed depends just on instruction order and alignment. A GA could shuffle the instructions around instead of the programmer doing it by hand.
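
As a toy illustration of that shuffling, here's a mutation-only hill climb (simpler than a full GA, but the move is the same): swap two instructions at random, reject the swap if it breaks a dependency, keep it if the stub cost function says it's no worse. The dependency table and the "stall" cost are invented; a real version would operate on the actual macro and use a measured cycle count:

[code]
#include <algorithm>
#include <cstdlib>
#include <ctime>
#include <iostream>
#include <vector>

struct Insn { int id; int depends_on; };   // depends_on = -1 means no dependency

// A sequence is valid only if every instruction comes after the one it depends on.
bool valid(const std::vector<Insn>& seq) {
    for (size_t i = 0; i < seq.size(); ++i)
        for (size_t j = i + 1; j < seq.size(); ++j)
            if (seq[i].depends_on == seq[j].id)
                return false;   // a value is used before it is produced
    return true;
}

// Stub cost: pretend back-to-back dependent instructions stall the pipeline.
// A real version would benchmark the reordered macro on the target CPU.
int cost(const std::vector<Insn>& seq) {
    int stalls = 0;
    for (size_t i = 1; i < seq.size(); ++i)
        if (seq[i].depends_on == seq[i - 1].id)
            ++stalls;
    return stalls;
}

int main() {
    std::srand(static_cast<unsigned>(std::time(0)));

    // Toy dependency chain: instruction i depends on deps[i].
    int deps[] = { -1, 0, -1, 2, 1, 3 };
    std::vector<Insn> seq;
    for (int i = 0; i < 6; ++i) {
        Insn in = { i, deps[i] };
        seq.push_back(in);
    }

    // Random swap, keep it if legal and no worse.
    for (int step = 0; step < 1000; ++step) {
        std::vector<Insn> trial = seq;
        std::swap(trial[std::rand() % trial.size()],
                  trial[std::rand() % trial.size()]);
        if (valid(trial) && cost(trial) <= cost(seq))
            seq = trial;
    }

    std::cout << "stalls after reordering: " << cost(seq) << std::endl;
    return 0;
}
[/code]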

OTOH, there are known hardware features and constraints, and one could just use a set of (heuristic) rules to optimize the code.

And I'm sure my new idea regarding memory layout for an FFT couldn't be found that easily using GAs.
2003-12-09, 20:41   #11
Dresdenboy
Apr 2003
Berlin, Germany
19² Posts

Covering interesting topics:
http://www.ece.cmu.edu/~spiral/prese...ntel-jul02.pdf

Genetic algorithms, dynamic programming etc. to find optimal DFT formulas.