mersenneforum.org > Great Internet Mersenne Prime Search > Hardware
Old 2003-03-10, 17:52   #1
Marco
 
Mar 2003

2·3 Posts
Chance to use modern Graphics Cards as..

Hi from Lucca, Italy! I'm a Prime95 user and I have a little question for everyone...
Is there a possibility to make good use of the power of modern graphics processing units to speed up the factoring of Mersenne numbers?
Thanks in advance!

Marco.
Old 2003-03-10, 18:36   #2
cheesehead
 
 
"Richard B. Woods"
Aug 2002
Wisconsin USA

2²·3·641 Posts

I'm sure it's possible to program some graphics units to do trial factoring (even an HP-25 in 1976 could do that), maybe even stage 1 P-1 factoring. But I don't know whether their speed could come close to or exceed what current CPUs can do.

Can you investigate further? Study how trial factoring is done in current GIMPS software, then determine whether some graphics unit can perform the same or similar algorithm, and maybe try programming it yourself. :)
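For anyone curious what "trial factoring" involves here: any factor q of a Mersenne number M_p = 2^p - 1 (p prime) must have the form q = 2kp + 1 and satisfy q ≡ ±1 (mod 8), and q divides M_p exactly when 2^p ≡ 1 (mod q). A minimal Python sketch of the idea (not the actual GIMPS code, which is heavily optimized assembly):

```python
def trial_factor(p, k_limit=100000):
    """Trial-factor M_p = 2**p - 1.

    Any factor q of M_p must have the form q = 2*k*p + 1
    and satisfy q % 8 in (1, 7).  q divides M_p exactly
    when 2**p % q == 1.
    """
    for k in range(1, k_limit + 1):
        q = 2 * k * p + 1
        if q % 8 not in (1, 7):
            continue
        if pow(2, p, q) == 1:  # modular exponentiation
            return q
    return None

print(trial_factor(11))  # M_11 = 2047 = 23 * 89; prints 23
```

The point is that the inner test is just one modular exponentiation per candidate, which is exactly the kind of small, repetitive kernel one would try to offload.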
Old 2003-03-13, 04:10   #3
QuintLeo
 
 
Oct 2002
Lost in the hills of Iowa

2⁶×7 Posts

The trick is to get the GPU to do the factoring (or LL; I think they support floating point in those things) in between everything else it does, and IN ADDITION to whatever the CPU is doing.
Old 2003-03-13, 05:21   #4
cheesehead
 
 
"Richard B. Woods"
Aug 2002
Wisconsin USA

2²×3×641 Posts

Now, now, let's not pile too much on a new guy's suggestion.

First, one needs to get the GPU to do factoring at all.

Then we can proceed with multitasking and optimizations.
Old 2003-03-13, 23:28   #5
crash893
 
 
Sep 2002

2³·37 Posts

I don't know and I'm not an expert, but does a video GPU even have a path back to the computer?

I always thought it was a one-way thing: you sent it the data and the instructions and it displayed whatever on the screen.

Don't get me wrong, I think it would be awesome to have your video card do some extra work; I've thought about it in the past.
Old 2003-03-14, 07:01   #6
adpowers
 
 
Sep 2002

2⁴×5 Posts

I am fairly sure you can send the necessary information back to the computer. For example, you can take screenshots. However, I think I remember reading that downloading from the graphics card is much slower than uploading to it. I don't think that would matter too much as long as you have sufficient video memory.
Old 2003-03-16, 01:53   #7
nucleon
 
 
Mar 2003
Melbourne

5×103 Posts
vid cards

I'd be very interested in this.

I reckon the GPU would be most useful; the main GPU I'm thinking of is the GeForce FX.

It has 128-bit FP units, while Intel's FPU maxes out at 80 bits. It also has programmable texture units. My thought is that you could configure the programmable texture units for the algorithms you need (factoring, LL), and then your initial values are constructed as a texture.

Without any detailed analysis (i.e. I'm talking out of my rear end :) ), I think the main benefit would be the insane parallelism. I'm guessing that in one texture you could have multiple primes to be tested.
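To illustrate the kind of data parallelism being suggested, here is a hypothetical Python sketch that screens a whole batch of candidate factors of M_p in one pass. On a GPU each candidate would occupy one texture element and all would be tested simultaneously; this just simulates the idea on the CPU:

```python
def batch_trial_factor(p, ks):
    """Screen a batch of candidate factors q = 2*k*p + 1
    of M_p = 2**p - 1 in one pass, the way a GPU could process
    one texture full of candidates in parallel (simulated here
    with a plain list comprehension)."""
    candidates = [2 * k * p + 1 for k in ks]
    # keep only q = +/-1 (mod 8) that actually divide M_p,
    # i.e. 2**p == 1 (mod q)
    return [q for q in candidates if q % 8 in (1, 7) and pow(2, p, q) == 1]

print(batch_trial_factor(11, range(1, 1000)))  # divisors of M_11 = 2047
```

Every candidate runs the identical test with no data dependencies between them, which is exactly the access pattern GPUs are built for.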

I can't seem to find a raw benchmark for the GeForce FX chip (GFLOPS etc.) to compare it to a P4. But I did find that the GeForce FX has a memory throughput of around 16 GB/s, whereas a P4 with PC2100 RAM (266 DDR) has around 2.1 GB/s.

The GeForce FX chip has more than twice the transistor count of a P4: 125 million vs 55 million.

Like I said, I could be full of it, but I'd like to hear other people's thoughts.

-- Craig
Old 2003-03-16, 04:22   #8
roy1942
 
Aug 2002

47 Posts

Step 1: . . . How does one get programming info for these devices?

It seems to me that there would be a lot to consider here. Windows at least detects video cards and uses them for the display. You wouldn't want some program to pop up a window and flush your work down the drain. Maybe you'd want two video cards - a simple one to provide a display while you decouple your fancy one from display duties so it can devote itself to the calculations.
Old 2003-03-16, 17:36   #9
apocalypse
 
Feb 2003

256₈ Posts
What about a custom card?

On a tangent, how difficult/expensive would it be to design and build a custom GIMPS-only add-in card which could do some (or all) of the following: TF, P-1 (stage 1 only?), LL ? Any ideas?
Old 2003-03-16, 19:03   #10
crash893
 
 
Sep 2002

2³×37 Posts

Probably pretty up there as far as man-hours and cost vs. payback and results.

I was talking to some computer friends of mine; they said that GPUs are optimized for computing triangles, much more so than your regular CPU.

They are specialized: they do one thing very well and everything else very mediocre.

Anyway, if you could tie triangle calculation back to factoring or something like that, then yeah, it should kick ass.
Old 2003-03-24, 08:47   #11
nucleon
 
 
Mar 2003
Melbourne

1000000011₂ Posts

Quote:
Originally Posted by roy1942
Step 1: . . . How does one get programming info for these devices?

It seems to me that there would be a lot to consider here. Windows at least detects video cards and uses them for the display. You wouldn't want some program to pop up a window and flush your work down the drain. Maybe you'd want two video cards - a simple one to provide a display while you decouple your fancy one from display duties so it can devote itself to the calculations.
roy - is there some sort of developer kit on the NVIDIA site? Or some form of developer kit as part of DirectX 9?

-- Craig