mersenneforum.org > Great Internet Mersenne Prime Search > Hardware
Old 2008-05-12, 19:57   #1
petrw1
1976 Toyota Corona years forever!

"Wayne"
Nov 2006
Saskatchewan, Canada

19·251 Posts

Is PrimeNet / GIMPS considering GPUs?

From: http://www.pcworld.ca//news/column/b...3d0f09/pg1.htm

Quote:
Graphics Acceleration

Thought your fancy video card was only good for gaming? Think again. Its graphics processing unit (GPU) is really like a second, highly specialized CPU. When it comes to certain kinds of complex math, its performance puts your desktop CPU to shame.

Until recently, all that power went to waste when you weren't chalking up frags. But computer scientists are finding novel ways to use GPU acceleration to speed up applications off-screen, as well. For example, a Stanford University project-- which uses many PCs around the world acting together as a supercomputer to assist protein folding-related disease research--can offload calculations to the GPU to multiply its performance many times.

Because the kind of calculations used to draw 3D graphics are also applicable to many other problems, GPU acceleration is potentially useful for a wide variety of applications, from math-intensive science and engineering to complex database queries. Newer, even more complex chips--such as nVidia's Aegia physics engine--can do even more. No wonder nVidia has begun working on chips for the workstation market.

Increasingly, your PC's performance won't depend on the speed of any single chip. As AMD and Intel get into the game, expect future desktop CPUs to incorporate CPU and GPU capabilities into a single, multicore package, bringing the best of both worlds to gamers and nongamers alike.
Old 2008-05-12, 22:57   #2
fivemack
(loop (#_fork))

Feb 2006
Cambridge, England

1100100101001₂ Posts

Yes, people contemplate using GPUs quite frequently. The problem is that GPUs want to deal with numbers 16 or 23 bits at a time, and the Prime95 code wants to deal with numbers 53 or 64 bits at a time; whilst you can build high-accuracy operations from smaller ones, it takes so many of the smaller operations that the advantage of the GPU is lost. High-accuracy GPUs have been rumoured to appear by the end of this year - the Intel G965 chipset lets you work with 32-bit numbers, which is nearly enough - but they are likely to be available only in very expensive workstation graphics cards, too expensive to buy one just to play with, though possibly nonetheless better performance-per-dollar than normal processors.

For sieving operations, the problem is that the GPU memory would prefer you to work in chunks of 128 bits, and sieving really likes working with one bit at a time. Again, you can get round it, but at such a cost that the speed advantage is pretty much lost.
Old 2008-05-13, 03:12   #3
jasonp
Tribal Bullet

Oct 2004

3·1,181 Posts

See this thread and this thread for the latest on this oft-mentioned topic. Mods: can someone catalog the threads where this has come up, and make it a sticky or something?
Old 2008-05-13, 13:12   #4
Xyzzy

Aug 2002

2×3×5×277 Posts

Quote:
Mods: can someone catalog the threads where this has come up, and make it a sticky or something?
Hmm, you are the mod for this forum…
Old 2008-05-13, 20:23   #5
jasong

"Jason Goatcher"
Mar 2005

3×7×167 Posts

Quote:
Originally Posted by fivemack View Post
Yes, people contemplate using GPUs quite frequently. The problem is that GPUs want to deal with numbers 16 or 23 bits at a time, and the Prime95 code wants to deal with numbers 53 or 64 bits at a time; whilst you can build high-accuracy operations from smaller ones, it takes so many of the smaller operations that the advantage of the GPU is lost. High-accuracy GPUs have been rumoured to appear by the end of this year - the Intel G965 chipset lets you work with 32-bit numbers, which is nearly enough - but they are likely to be available only in very expensive workstation graphics cards, too expensive to buy one just to play with, though possibly nonetheless better performance-per-dollar than normal processors.

For sieving operations, the problem is that the GPU memory would prefer you to work in chunks of 128 bits, and sieving really likes working with one bit at a time. Again, you can get round it, but at such a cost that the speed advantage is pretty much lost.
I typed a bunch of stuff and then lost it, somehow. grrrrr...

Graphics cards can indeed do LLR lightning fast; it's just that the proof hasn't been made public. The person who created the program is very sick, so going public is not something that interests him. In order for it to be made public, he will either have to recover or do something which, since he frequents these forums, I'm not actually going to state. Suffice it to say, I strongly prefer the first choice.

If I remember correctly, he said it is relatively easy to port the core code to graphics cards. I'm guessing that the hairy part is getting it to work on a machine where the graphics card is used for the OS as well. Getting it to work on a Linux machine is what he's accomplished (not sure if it simultaneously displayed a GUI), and porting to Windows was what he was going to attempt when his health took a dive.
Old 2008-05-14, 16:16   #6
davieddy

"Lucan"
Dec 2006
England

14512₈ Posts

Quote:
Originally Posted by jasonp View Post
See this thread and this thread for the latest on this oft-mentioned topic. Mods: can someone catalog the threads where this has come up, and make it a sticky or something?
In the light of the PS3 thread reincarnated by Jasong and Minigeek (inter alia), the need for this is ever more pressing. In particular, I could answer Minigeek's questions, but fairly recently Ernst gave a very clear and authoritative answer.

Last fiddled with by davieddy on 2008-05-14 at 16:22
Old 2008-05-14, 16:25   #7
jasonp
Tribal Bullet

Oct 2004

3×1,181 Posts

Quote:
Originally Posted by davieddy View Post
In the light of the PS3 thread reincarnated by Jasong and Minigeek (inter alia), the need for this is ever more pressing. In particular, I could answer Minigeek's questions, but fairly recently Ernst gave a very clear and authoritative answer.
See the new sticky in this forum.
Old 2008-05-14, 16:37   #8
davieddy

"Lucan"
Dec 2006
England

194A₁₆ Posts

Quote:
Originally Posted by jasonp View Post
See the new sticky in this forum.
THX
Old 2008-06-18, 23:57   #9
James Heinrich

"James Heinrich"
May 2004
ex-Northern Ontario

2×17×103 Posts

What about with the new nVidia GT200-series cards (GTX-260, GTX-280)?
Quote:
http://www.anandtech.com/video/showdoc.aspx?i=3334
...One of the major new features is the ability to process double precision floating point data in hardware (there are 30 64-bit FP units in GT200)...
Old 2008-06-19, 00:27   #10
cheesehead

"Richard B. Woods"
Aug 2002
Wisconsin USA

2²×3×641 Posts

Quote:
Originally Posted by James Heinrich View Post
What about with the new nVidia GT200-series cards (GTX-260, GTX-280)?
Caution: Note the references to heat problems in that article and the comments ("this card begs for a die shrink") below it. Wait for 55nm or 45nm versions.
Old 2008-06-19, 01:43   #11
James Heinrich

"James Heinrich"
May 2004
ex-Northern Ontario

DAE₁₆ Posts

Not saying it's an ideal solution (and I'm certainly not paying $600 for a GIMPS card), but finally some graphics cards with double-precision FPUs are coming into the mainstream. As that trickles down to entry-/mid-range cards, and after a die shrink or two, it has potential, I think(?)


Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 or any later version published by the Free Software Foundation.
A copy of the license is included in the FAQ.