mersenneforum.org bash and HTTP?

 2011-12-13, 02:47 #1 Dubslow Basketry That Evening!     "Bunslow the Bold" Jun 2011 40
 2011-12-13, 04:52 #2 Christenson     Dec 2010 Monticello 111000000112 Posts Dubslow: There are certainly 9 ways to skin the mfaktc automation cat...and a bash (or perl, or Ruby, or...oh, pick your language!) script can certainly be used. I seem to remember zombie processes as still occupying slots in the process table, but no longer executing or holding any memory allocation. I don't think it would be hard under mprime to let you stop just one worker...you can certainly do it in the Windows GUI...
 2011-12-13, 06:42 #3 Dubslow Basketry That Evening!     "Bunslow the Bold" Jun 2011 40
2011-12-13, 16:18   #4
Xyzzy

Aug 2002

2·7·13·47 Posts

Quote:
 Random other question to get maximum thread efficiency: Is there a way to pause just one worker in MPrime?
In the past we have run four instances of Mprime to be able to control each one individually. It is a bit of a hassle but it works.

2011-12-13, 16:20   #5
Xyzzy

Aug 2002

855410 Posts

Quote:
 How would you do HTTP?

2011-12-13, 18:54   #6
chalsall
If I May

"Chris Halsall"
Sep 2002

2·72·113 Posts

Quote:
 Originally Posted by Dubslow I know Christenson's working on automating mfaktc, but I was wondering, is it possible to incorporate sending HTTP stuff via bash? I'd like to crap together something just for me to report results.
I don't know how to do this with just Bash, but I do know how to with Perl. Please see the Automatic Submission Spider for Workers thread.

It should give you a good idea about just how easy it is to build spiders using Perl. Also, hopefully, it will be useful.

 2011-12-13, 23:48 #7 Christenson     Dec 2010 Monticello 70316 Posts I think he would need wget or something to do it with just bash...
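To sketch what "wget from bash" could look like: the URL and endpoint below are hypothetical placeholders (PrimeNet's real manual-submission interface is not shown in the thread), and the script only echoes the command as a dry run rather than sending anything.

```shell
# Hypothetical sketch: submit a results file via HTTP POST from bash.
# The URL is a placeholder, NOT the real PrimeNet endpoint.
RESULTS_FILE=results.txt
printf 'M12345 no factor from 2^60 to 2^61\n' > "$RESULTS_FILE"

# wget can send a POST body straight from a file with --post-file.
SUBMIT_CMD="wget --quiet --post-file=$RESULTS_FILE -O response.html http://example.invalid/manual_result"

# Dry run: show the command instead of actually running it.
echo "$SUBMIT_CMD"
rm -f "$RESULTS_FILE"
```

As chalsall notes below, bash here is just the glue; wget is the program doing the HTTP work.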
 2011-12-14, 17:58 #8 Dubslow Basketry That Evening!     "Bunslow the Bold" Jun 2011 40
2011-12-14, 18:26   #9
chalsall
If I May

"Chris Halsall"
Sep 2002

2×72×113 Posts

Quote:
 Originally Posted by Dubslow So I could construct my own HTTP requests with wget? That's what a quick google leads me to believe...
Sure. That's what wget does. But just to be pedantic, this isn't Bash doing the work -- it's wget being called from Bash (or whatever).

At your console (or a web browser), "man wget".

But I would still argue Perl with the LWP::UserAgent and HTTP::Cookies (optionally et al) modules is a better solution space.

 2011-12-14, 18:37 #10 Dubslow Basketry That Evening!     "Bunslow the Bold" Jun 2011 40
2011-12-14, 19:20   #11
chalsall
If I May

"Chris Halsall"
Sep 2002

1107410 Posts

Quote:
 Originally Posted by Dubslow I'm sure it is, but again, this isn't about being practical. I just want a tool to mess with HTTP, and interacting with PrimeNet seems like a great way to get started (I promise I won't spam/overload it) Also with bash I can do if [ $(wc -l results.txt) -ge 10 ]; then wget .... and I'm sure Perl would be great too, but... In Perl it's only a little more complicated: Code: @LC = split(" ", wc -l results.txt); if ($LC[0] > 10) {
...
}
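An editorial aside on the quoted bash test, not raised in the thread: `wc -l results.txt` prints the count followed by the filename, so `[ $(wc -l results.txt) -ge 10 ]` expands to three words and fails. Redirecting stdin makes wc print only the number:

```shell
# wc -l FILE prints "N FILE"; wc -l < FILE prints just "N".
printf 'line1\nline2\nline3\n' > results.txt

count=$(wc -l < results.txt)   # just the number, safe to use in [ ... ]
if [ "$count" -ge 3 ]; then
    echo "enough results to submit"
fi
rm -f results.txt
```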
But I must warn you again, half of the work of spidering is dealing with unexpected results and errors.

Nearly the entire other half is parsing the responses. How do you plan on handling the regular expressions? Sed? Or Awk?

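To illustrate the parsing half chalsall describes, here is a toy sketch in shell. The HTML snippet is invented for the example (it is not a real PrimeNet response); sed's capture groups pull the one value we care about out of the markup.

```shell
# Made-up server response, standing in for whatever HTML the spider gets back.
response='<html><body>Accepted: <b>42</b> results processed.</body></html>'

# Classic sed capture: keep only the digits between <b> and </b>.
accepted=$(printf '%s\n' "$response" | sed -n 's/.*<b>\([0-9]*\)<\/b>.*/\1/p')

echo "server accepted $accepted results"
```

Real responses are messier (multi-line markup, error pages, redirects), which is exactly the "unexpected results and errors" half of the warning above.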
