#1
P90 years forever!
Aug 2002
Yeehaw, FL
2·17·241 Posts
Are all the factoring results from 79.3M to 1000M readily available?
I have 255 MB of data from a year ago.
#2
Jul 2004
Nowhere
809 Posts
I have a ton of old results...
I can zip them down really small and send them to you; the estimated size is around 300 MB compressed.
#3
P90 years forever!
Aug 2002
Yeehaw, FL
2·17·241 Posts
Do you have somewhere you can upload it to, so that I can download it by FTP? I think 300 MB will be rejected by my email provider.
Please be aware that the v5 Primenet server is about to become the central repository for all factoring data up to exponent 1 billion.
#4
Aug 2002
20600₈ Posts
Email them to us in 19 MB chunks, preferably compressed, and we'll upload them to the server for GW to download.
If you have access to a real operating system and the file is literally one big file, you can bzip2 it and then use split to break it into chunks. If they are a group of files, just tar them, bzip2/gzip the archive, and then split it.
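For what it's worth, the same recipe can be scripted. Here is a minimal Python sketch of the compress-and-split idea plus reassembly (the file name results.txt and the chunks/ directory are made up for illustration, and it reads the whole file into memory, which is fine for a few hundred MB); on a Unix box, tar, bzip2, and split do exactly the same job.

```python
import bz2
from pathlib import Path

CHUNK = 19 * 1024 * 1024  # ~19 MB pieces, matching the mail size limit above

def compress_and_split(src: str, dest_dir: str = "chunks") -> None:
    """bzip2-compress `src` and write the result out in fixed-size pieces."""
    Path(dest_dir).mkdir(exist_ok=True)
    data = bz2.compress(Path(src).read_bytes())
    for i in range(0, len(data), CHUNK):
        part = Path(dest_dir) / f"{Path(src).name}.bz2.{i // CHUNK:03d}"
        part.write_bytes(data[i:i + CHUNK])

def reassemble(dest_dir: str, out_file: str) -> None:
    """Concatenate the pieces in order and decompress (the `cat parts | bunzip2` step)."""
    parts = sorted(Path(dest_dir).iterdir())
    Path(out_file).write_bytes(bz2.decompress(b"".join(p.read_bytes() for p in parts)))

# compress_and_split("results.txt")      # hypothetical input file
# reassemble("chunks", "results.txt")    # puts it back together on the other end
```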
#5
Jul 2004
Nowhere
1100101001₂ Posts
Is there an FTP drop-off, Xyzzy, that I could use? You can PM me the info.
#6
Jul 2004
Nowhere
1100101001₂ Posts
Correction: I have 700 MB of stuff; it will zip a whole lot smaller.
I'm also working on 1000M to 2000M slowly; heat is an issue.
#7
Aug 2002
2⁷×67 Posts
All we have is Gmail. Every time we set up FTP on the server here we break something and the trolls get real pissed off.
#8
Jul 2004
Nowhere
809 Posts
OK, I will send you an email with a download link to the file.
#9
Jan 2003
Altitude>12,500 MSL
101₁₀ Posts
Moo, I retrieved and unzipped the archive. I will be parsing and rewriting the files into two v5.0 server-ready data files, one for factors, another for factoring progress.
It's not clear from quickly eyeballing the 700 MB of data whether there is redundancy in the coverage between the various files. The server-ready files must not have redundant data coverage. What is the minimum list of files from the unzipped archive needed to produce the loadable server files?
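For anyone curious how such a merge might look: the sketch below collapses overlapping trial-factoring coverage by keeping, per exponent, the highest no-factor bound and the union of reported factors. It is only an illustration; the two regexes are assumptions about typical client output lines, not the actual v5 loader format, and real results files vary between client versions.

```python
import re
from collections import defaultdict

# Illustrative patterns only; real result lines differ between client versions.
NO_FACTOR = re.compile(r"M(\d+) no factor (?:from 2\^\d+ )?to 2\^(\d+)")
HAS_FACTOR = re.compile(r"M(\d+) has a factor: (\d+)")

def merge(result_files):
    """Collapse overlapping TF coverage: best bit level and all factors per exponent."""
    best_level = defaultdict(int)   # exponent -> highest "no factor to 2^n" bound seen
    factors = defaultdict(set)      # exponent -> set of reported factors
    for path in result_files:
        with open(path, errors="replace") as fh:
            for line in fh:
                if m := NO_FACTOR.search(line):
                    p, n = int(m.group(1)), int(m.group(2))
                    best_level[p] = max(best_level[p], n)
                elif m := HAS_FACTOR.search(line):
                    factors[int(m.group(1))].add(int(m.group(2)))
    return best_level, factors
```

Writing best_level and factors back out then gives two non-overlapping files, one for factoring progress and one for factors, no matter how much the input files overlapped.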
#10
Jan 2003
Altitude>12,500 MSL
101 Posts
Using a different method, I'm loading all the files unanalyzed, gaps and redundancy included. We will verify the loaded factors, however. Thanks for the data!
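A note on that verification step: checking a reported factor f of M(p) = 2^p - 1 is cheap, because f divides 2^p - 1 exactly when 2^p ≡ 1 (mod f). A minimal sketch (the small p = 11 example is just an illustration, not data from the archive):

```python
def is_mersenne_factor(p: int, f: int) -> bool:
    """True iff f divides the Mersenne number 2**p - 1, checked by modular exponentiation."""
    return f > 1 and pow(2, p, f) == 1

# Known small case: 2**11 - 1 = 2047 = 23 * 89
assert is_mersenne_factor(11, 23)
assert is_mersenne_factor(11, 89)
assert not is_mersenne_factor(11, 7)
```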
#11
Jul 2004
Nowhere
809 Posts

Quote:
Also, you might want to make the database hold exponents up to around 3000M-4000M. You might also want to make a project called the LMH double-check project, which re-checks factors found by pre-v5 results files.