Chance to use modern Graphics Cards as..
Hi from Lucca, Italy! I'm a Prime95 user and I have a little question for everyone...
Is there a possibility of making good use of modern graphics processing units' power to speed up the factoring of Mersenne numbers? Thanks in advance! Marco. |
I'm sure it's possible to program some graphics units to do trial factoring (even an HP-25 in 1976 could do that), maybe even stage 1 P-1 factoring. But I don't know whether their speed could come close to or exceed what current CPUs can do.
Can you investigate further? Study how trial factoring is done in current GIMPS software, then determine whether some graphics unit can perform the same or similar algorithm, and maybe try programming it yourself. :) |
The trick is to get the GPU to do the factoring (or LL; I think those things support floating point) in between everything else it does, and IN ADDITION to whatever the CPU is doing.
|
Now, now, let's not pile too much on a new guy's suggestion.
First, one needs to get the GPU to do factoring at all. [i]Then[/i] we can proceed with multitasking and optimizations. |
I don't know, and I'm not an expert, but does a video GPU even have a path back to the computer?
I always thought it was a one-way thing: you sent it the data and the instructions and it displayed whatever on the screen. Don't get me wrong, I think it would be awesome to have your video card do some extra work; I've thought about it in the past. |
I am fairly sure you can send the necessary information back to the computer. For example, you can take screenshots. However, I think I remember reading that downloading off the graphics card is much slower than uploading to it. I don't think that would matter too much as long as you have sufficient video memory.
|
vid cards
I'd be very interested in this.
I rekon the GPU would be most useful - the main GPU I'm thinking of is the GeForceFX. It has 128bit FP units. Intel's FPU maxes out at 80bit. It also has programmable texture units. My thoughts are that you could configure the programmable texture units for the algorithims you need (factoring, LL) and then you initial values are constructed as a texture. Without any detailed analysis (i.e I'm talking out of my rear end :) ), I think the main benefit would be the insane parallellism. I'm guessing in one texture you could have multiple primes to be tested. I can't seem to find a raw benchmark for the GeForceFX cpu, like gflops etc.. to compare it to a P4. But I did find the GeForce FX has a memory through put of around 16GB/sec, where as a P4 with PC2100 ram (266 DDR) will have around 2.1GB/sec memory throughput. The GeForceFX CPU has more than twice the transistor cound of a P4 - 125million vs 55million. Like I said, I could full of it, but I'd like to hear other people's thoughts. -- Craig |
Step 1: . . . How does one get programming info for these devices?
It seems to me that there would be a lot to consider here. Windows at least detects video cards and uses them for the display. You wouldn't want some program to pop up a window and flush your work down the drain. Maybe you'd want two video cards - a simple one to provide a display while you decouple your fancy one from display duties so it can devote itself to the calculations. |
What about a custom card?
On a tangent, how difficult/expensive would it be to design and build a custom GIMPS-only add-in card which could do some (or all) of the following: TF, P-1 (stage 1 only?), LL ? Any ideas?
|
Probably pretty far up there as far as man-hours and cost vs. payback and results.
I was talking to some computer friends of mine. They said that GPUs are optimized for computing triangles, much more so than your regular CPU; they're specialized, so they do one thing very well and other things very mediocre. Anyway, if you could tie triangle calculation back to factoring or something like that, then yeah, it should kick ass |
[quote="roy1942"]Step 1: . . . How does one get programming info for these devices?
It seems to me that there would be a lot to consider here. Windows at least detects video cards and uses them for the display. You wouldn't want some program to pop up a window and flush your work down the drain. Maybe you'd want two video cards - a simple one to provide a display while you decouple your fancy one from display duties so it can devote itself to the calculations.[/quote] roy - is there some sort of developer's kit on the nvidia site? Or some form of developer's kit as part of DirectX 9? -- Craig |