
My first application!


Towel
New Member
 
Join Date: May 2004
 
2004-05-18, 03:18

Well, not quite an application, more like a reasonably complicated Perl script. But it's my baby! My experiments with DNA arrays require a lot of number-crunching, and until now that meant a lot of rather laborious SQL scripts and Excel manipulations. I'm too cheap to buy the mega-bucks commercial software, and I want to do things they don't do anyway.

So I've been meaning for months to write something that would automate the whole process, and finally got down to it in a serious way this past weekend. Shell scripting could do much of it, but bash chokes on floating points (ack!) so I had to teach myself the rudiments of Perl. Muuuuch better. And I quickly discovered that I could call bash scripts from within the main Perl script, like subroutines. Pretty cool, and made calling MySQL a heck of a lot easier (I hate flat quotation marks!).
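
For anyone curious, the bash-from-Perl trick is really just backticks or system(); something along these lines (the script names and argument here are made-up stand-ins, not my real ones):

Code:
#!/usr/bin/perl
use strict;
use warnings;

# Call a helper bash script like a subroutine and capture its output.
# run_query.sh and its argument are placeholders for illustration.
my $output = `./run_query.sh probe_set_1`;
die "run_query.sh failed: $?" if $? != 0;
print "Got back: $output";

# Or, when I only care that it ran, not what it printed:
system('./cleanup.sh') == 0
    or die "cleanup.sh exited with $?";

Backticks grab whatever the script prints; system() just tells you whether it exited cleanly.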

So the whole shebang is one 100-ish-line Perl script, plus four small bash scripts and a couple of files containing my long, long lists of input variables that the script crunches through one by one. A little clunky, and even worse, I use a few temporary files as variable storage because I can't always figure out a good way to get the three scripting languages (SQL, Perl, bash) to talk to one another. But clunky and all, she's my baby.
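
The temp-file handoff is roughly this shape (the file names and the query.sh helper are invented for the example; my real scripts are messier):

Code:
#!/usr/bin/perl
use strict;
use warnings;

# Walk the long list of input variables, one per line.
open(my $in, '<', 'input_variables.txt') or die "can't open input list: $!";
while (my $probe = <$in>) {
    chomp $probe;

    # The bash helper runs the MySQL query and dumps the result
    # into a temporary file that Perl then reads back in.
    system('./query.sh', $probe, '/tmp/result.txt') == 0
        or die "query.sh failed on $probe: $?";

    open(my $tmp, '<', '/tmp/result.txt') or die "can't read temp file: $!";
    my @rows = <$tmp>;
    close $tmp;

    # ...number-crunching on @rows goes here...
}
close $in;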

I know this sounds a little pathetic, but understand I am just a poor, dumb, wet-bench biochem grad student. This is a major advance in my harnessing of all that OS X goodness. One of these days (months from now...) I'll try rolling this into an actual Cocoa application that my lab-mates could use. I've been piddling with Cocoa on and off, but the learning curve is a bit steeper.

Ah. Nothing like the stress of an upcoming committee meeting to drive a grad student to new heights of productivity.

Did I mention OS X rocks!? BSD goodness, translucent Terminal windows, Expose...mmmm.
SilentEchoes
Unique Like Everyone Else
 
Join Date: May 2004
Location: Rochester, NY
2004-05-18, 05:16

Haha, I know. I am a PHP/SQL developer. There is no way I could ever dream of doing that in OS 9 all on one box. Sure, I could set up a home server and FTP the files over... but having a whole separate computer with a real OS on it is a bit of a stretch.

WARNING: Do not let Dr. Mario touch your genitals. He is not a real doctor.
Gargoyle
http://ga.rgoyle.com
 
Join Date: May 2004
Location: In your dock hiding behind your finder icon!
 
2004-05-18, 05:53

Quote:
Originally Posted by SilentEchoes
Haha, I know. I am a PHP/SQL developer. There is no way I could ever dream of doing that in OS 9 all on one box. Sure, I could set up a home server and FTP the files over... but having a whole separate computer with a real OS on it is a bit of a stretch.
Urrm, that's why he uses OS X! You need to ditch that old OS 9 and join everyone else in the 21st century!
SilentEchoes
Unique Like Everyone Else
 
Join Date: May 2004
Location: Rochester, NY
2004-05-18, 06:00

I have. Four years ago, with DP4, I said there was no way I could ever dream of doing all that in OS 9. Meaning I have OS X, and thank god I don't use that lame excuse for an OS known as OS 9.

WARNING: Do not let Dr. Mario touch your genitals. He is not a real doctor.
Gargoyle
http://ga.rgoyle.com
 
Join Date: May 2004
Location: In your dock hiding behind your finder icon!
 
2004-05-18, 07:40

ahh. *he he*
123
New Member
 
Join Date: May 2004
 
2004-05-18, 12:58

Quote:
Originally Posted by Towel
and even worse I use a few temporary files as variable storage because I can't always figure out a good way to get the three scripting languages (SQL, Perl, bash) to talk to one another.
The traditional Unix way of doing this is to create scripts that read from stdin and write to stdout. You can then easily build chains (pipelines).

It looks like your main script is in Perl. You can use the open() command to either read from or write to processes. However, if you need to be able to read AND write, you can either use open2 (IPC::Open2 module) or pipe, fork, exec yourself.
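
For the simpler one-direction cases, a plain piped open() is all you need; a quick sketch (sort, grep, and the file name are just examples):

Code:
#!/usr/bin/perl
use strict;
use warnings;

# Read FROM a process: open a pipe on its stdout.
open(my $reader, '-|', 'sort', 'data.txt') or die "can't run sort: $!";
while (my $line = <$reader>) {
    print "sorted: $line";
}
close $reader;

# Write TO a process: open a pipe on its stdin.
open(my $writer, '|-', 'grep', 'w') or die "can't run grep: $!";
print $writer "asdljasdjaslj\nwoeiruwoeru\npweosapdo\n";
close $writer;   # grep prints its matches to our stdout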

Here's an example using open2:

Code:
#!/usr/bin/perl
use IPC::Open2;

my $pid = open2(\*RFH, \*WFH, 'grep', "w");

print WFH "asdljasdjaslj\n woeiruwoeru\n pweosapdo\n";
close WFH;   # if you don't close WFH here, a read call to RFH might block forever.

while (<RFH>) {
    print $_;
}
close RFH;

waitpid($pid, 0);
However, this only works if the data that is being passed back to you is not too big (a few KB) before you start reading. It is no problem if the script you are calling first reads everything from stdin, then processes the data, and then writes everything to stdout. But it can be a problem if the script reads some data, processes it, writes the result back to you, then reads again, etc., like grep does. You then have to interleave read and write calls, which has its own set of problems (look up select() or non-blocking I/O if you have to do that).
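
A minimal sketch of the select() route, using the core IO::Select module (grep and the test strings are only stand-ins):

Code:
#!/usr/bin/perl
use strict;
use warnings;
use IPC::Open2;
use IO::Select;

my $pid = open2(\*RFH, \*WFH, 'grep', 'w');
my $sel = IO::Select->new(\*RFH);
my $buf;

# Interleave writes and reads: after each chunk goes out, drain
# whatever grep has already written back, so its output pipe
# never fills up and deadlocks both processes.
for my $chunk ("asdljasdjaslj\n", "woeiruwoeru\n", "pweosapdo\n") {
    syswrite(WFH, $chunk);                     # unbuffered, reaches grep right away
    while ($sel->can_read(0.1)) {              # wait at most 0.1 s for output
        last unless sysread(RFH, $buf, 4096);
        print $buf;
    }
}

close WFH;                                     # no more input for grep
print $buf while sysread(RFH, $buf, 4096);     # drain whatever is left
close RFH;
waitpid($pid, 0);

The short timeout on can_read() keeps the reader from blocking forever while still draining grep's output between writes.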
Towel
New Member
 
Join Date: May 2004
 
2004-05-21, 01:42

Quote:
Originally Posted by 123
You can use the open() command to either read from or write to processes. However, if you need to be able to read AND write, you can either use open2 (IPC::Open2 module) or pipe, fork, exec yourself.
Interesting. I used open() to read and write the files that my bash/SQL scripts created. I don't think I needed to read and write simultaneously, but that's pretty cool. My main trouble was figuring out how to get the output I wanted (especially SQL results) into STDOUT for piping. I guess I never tried telling MySQL "into outfile 'STDOUT'", but I suspect that wouldn't work. So I wrote to a temporary file and used open() to read it.
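
For the record, one way to skip the temp file next time might be to run the mysql command-line client in batch mode through a pipe and read its tab-separated output directly. A rough sketch, with the database, table, and query made up:

Code:
#!/usr/bin/perl
use strict;
use warnings;

# Run the mysql client in batch mode and read its tab-separated
# output straight off a pipe, instead of INTO OUTFILE + a temp file.
# Database name, table, and query are placeholders.
open(my $sql, '-|', 'mysql', '--batch', '--skip-column-names',
     '-e', 'SELECT probe_id, signal FROM results', 'arraydb')
    or die "can't run mysql: $!";

while (my $row = <$sql>) {
    chomp $row;
    my ($probe, $signal) = split /\t/, $row;
    # ...number-crunching goes here...
    print "$probe => $signal\n";
}
close $sql or warn "mysql exited with status $?";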

Anyway, I finally got all the kinks worked out and ran my script for real this evening. It took just about an hour on a 1.42 GHz dual G4. Most of that time was taken up by the 3,200-ish SQL queries, some of which took several minutes. I actually thought it might take much longer than an hour, based on my preliminary runs. That dual-G4 workhorse really works. All for a grand total of about 100 KB worth of output data files. But that 100 KB represents a nice chunk of my future manuscript.