I think I've brought this issue up before, maybe via PM (I don't feel like searching for it now). The wiki license is CC-BY-NC-SA 3.0, which I believe was inherited from the old Mersenne Wiki. I have several related questions following from that basic premise:
[list=1]
[*]Is this the desired license for the wiki, if we had a free choice?
[*]If not, is it worth deleting/rewriting from scratch the imported Mersenne Wiki pages to allow for a relicense? Is that even possible?
[*]Is the potential ability to import templates/modules/CSS from Wikipedia/Wikimedia/MediaWiki enough of a reason to pursue a relicense?
[*]The purely mathematical data pages (e.g. Riesel prime ###) are currently under the same license as the mostly-text pages. Given that the mathematical data should not be eligible for copyright and should be PD, is a dual license appropriate: CC0 (which, for safety, would have the same effect as PD) for the data pages and another license for the text pages? This is an argument in support of putting the data in a separate subsystem, e.g. Wikibase.
[*]In fact, I could argue that the current license of the Prime-Wiki prevents its use by GIMPS users whose primary reason for contributing (and using the Prime-Wiki to look for related information) is to win a GIMPS research award or an EFF prize, since any editor of the wiki (as a copyright holder of some text) could consider that to be "commercial use". I don't think it would hold up in court, and I hope none of us would be that nefarious, but it's sufficiently murky that it ought to be considered.
[/list]
Karsten has indicated to me that a full migration to Wikibase is not in his current plans. However, Wikibase would be an excellent tool for some of the structured data in the Prime-Wiki. The multi-reservations specifically come to mind. Storing them in a Wikibase item would allow all of the fields to be edited on the same page, and it would eliminate the reuse of IDs for new projects, which is poor design. I'll design a rudimentary Wikibase instance focused on just the multi-reservations and related items on my personal wiki and publicize it.
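To illustrate what a multi-reservation might look like as a single Wikibase item (heavily simplified: the label, property IDs, and values below are purely hypothetical, and real Wikibase statements wrap each value in mainsnak/datavalue structures rather than bare strings):

```json
{
  "labels": {
    "en": { "language": "en", "value": "Example multi-reservation" }
  },
  "claims": {
    "P1": "reserved by (hypothetical property): example user",
    "P2": "project (hypothetical property): example project",
    "P3": "search range (hypothetical property): example range"
  }
}
```

The point of the sketch is that every field of the reservation lives on one editable item with a permanent item ID, so IDs would never need to be reused for new projects.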
I just finished importing the Riesel even-[I]n[/I] Liskovets-Gallot [I]k[/I]'s from the old RPPDb page into the wiki, and I wanted to add a table to the CRUS project page using a CSV file (attached). However, I'm getting a MIME error whenever I try to upload the file:
[quote]File extension ".csv" does not match the detected MIME type of the file (application/csv).[/quote] I don't know whether this is a browser issue (both Chromium and Firefox gave me the same error, and I also tried uploading it in Firefox as a plain .txt file with similar results), a server issue, or an upstream MediaWiki issue.
[QUOTE=Happy5214;602851]However, I'm getting a MIME error whenever I try to upload the file.[/QUOTE]
I've updated the extension ExternalData to the newest version, but this did not eliminate the error. I've inserted into the mime.types file of the MediaWiki source the line [code] application/csv csv [/code] (previously there was only an entry for "text/csv csv"). I've uploaded a new version of the data file for [url='https://www.rieselprime.de/ziki/Proth_prime_small_bases_least_n']Proth primes of the form k*b^n+1, least n-values[/url] and no error occurred. Please try again and see if the issue is solved for now. I've searched for an error like this but found no solution; an update to a newer MediaWiki version is not needed (and could also create other issues).
[QUOTE=kar_bon;602890]I've inserted into the mime.types file of the MediaWiki source the line [code] application/csv csv [/code] (previously there was only an entry for "text/csv csv"). [...] Please try again and see if the issue is solved for now.[/QUOTE] The issue is now fixed, thank you. I looked at the current version of the MediaWiki code while investigating this, and the newer versions appear to have the same problem (they only list [C]text/csv[/C] as a valid CSV MIME type), so an upgrade would not have fixed this issue (while causing many others). I don't know whether something changed in browsers' handling of CSV files, but I'm probably going to file a bug report with MediaWiki about this.
I'll preface this by saying that I haven't discussed this with anyone, but since you've already thought of the idea of referencing FactorDB numbers as a Prime-Wiki namespace, I'm wondering if you'd be open to using PW as a centralized database of ECM progress for general factorization efforts? The curve counts would be added as either remarks or a new field on long-number pages, using the standard wiki process.
I don't know exactly what you mean here. Can you give an example, please?
[QUOTE=kar_bon;603012]I don't know exactly what you mean here. Can you give an example, please?[/QUOTE]
As an example, [url]https://www.mersenneforum.org/showpost.php?p=602904&postcount=1006[/url] stated that ryanp performed 4200 ECM curves on [url]http://factordb.com/index.php?id=1100000002997883099[/url] (the cofactor of 1528152860898312226820507829734311038694803153043007^5-1) at B1=26e7. We could have a "long number" page for that linked FactorDB entry (or, alternatively, the entry for 1528152860898312226820507829734311038694803153043007^5-1, which would likely be more stable if additional factors are found but the number isn't completely factored) with a history entry in the remarks saying something like "2022-03-30: Performed 4200 curves at B1=26e7 by Ryan Propper", with a link to the forum post.
I think this is not the right place to store such information.
Questions: Who should put that information in? ryanp or Batalov? That information is even more unverifiable than it is for, say, Riesel numbers. The number of such composites is large, especially for the OPN roadblock files. The work to create a page for such a number is more time-consuming than the benefit anyone would get from it. How would those numbers be stored? How would one find them in the wiki? Stored by type, by length, or by project? Wasn't such information stored in the FactorDB but not made available?

[URL="https://stdkmd.net/"]Studio Kamada[/URL] does a good job of documenting ECM efforts; see this [URL="https://stdkmd.net/nrr/c.cgi?q=61666_272"]example[/URL] for (185*10^272-2)/3 (the same number in [URL="http://factordb.com/index.php?query=%28185*10%5E272-2%29%2F3"]FactorDB[/URL]):
- you can reserve such numbers
- an automatically generated ECM call is available
- you can fill in the amount of ECM done

A system like that would be best for documenting such work, but not the wiki, I think.
[QUOTE=kar_bon;603262]I think this is not the right place to store such information. [...] [URL="https://stdkmd.net/"]Studio Kamada[/URL] does a good job of documenting ECM efforts [...] A system like that would be best for documenting such work, but not the wiki, I think.[/QUOTE] AFAIK FactorDB never had in-progress ECM data for yet-to-be-factored composites (at least I'd never seen it), just the data for the curve that found a factor. While I agree that a purpose-built site like Studio Kamada's would be better suited, that particular site appears to cover only the near-repdigit effort, so it doesn't appear to be useful in its current form for other efforts (e.g. OPN, high-importance aliquot, Cunningham).