Ok, I DO NOT want a big debate about which is better. I just would like clarification as to why some scripts work on Unix but not NT. Is this the programmer's job, or does it have to do with the hosting company?
I bring this up because I'm at a crossroads where I have to decide whether to go with NT servers or Unix. I'm getting better at ASP, but I guess I could learn PHP if I had to.
Ok, let the games begin!
E-mail me for a 1-1 talk about setting up your dream business.
Most scripts that send email only work on UNIX because NT doesn't have anything like sendmail; you would have to talk to an SMTP server instead, which is beyond the skills of most programmers. Other scripts use functions that behave differently on NT, such as rand() (on NT you must call srand() before rand() will work; lots of programmers don't know that and just give up when rand() doesn't give a random number). Another reason is that lots of people develop and test their scripts on UNIX because it is easier to test them there (from the shell), so the programmer doesn't even know whether they work on NT, and just says they don't. Lots of programmers also like to use .db files via the dbmopen() function, which is not supported on NT. NT's IIS also requires that you print full HTTP headers instead of partial ones, which most UNIX servers don't. So a script that works on UNIX might not work on NT if it doesn't print full HTTP headers.
There are a few other reasons the scripts might not work on NT, but I forget them at the moment.
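To make two of those gotchas concrete, here is a rough sketch of a tiny CGI that seeds the random generator explicitly and prints full headers itself (the nph-style status line is the usual convention for IIS; check your server's docs before relying on it):

```perl
#!/usr/bin/perl
# Seed the generator explicitly -- don't assume the Perl build
# will do it for you. time ^ $$ is the classic quick-and-dirty seed.
srand(time ^ $$);
my $n = int(rand(100));    # now rand() actually varies per run

# Full HTTP headers: print the status line yourself instead of
# relying on the server to add it for you.
print "HTTP/1.0 200 OK\r\n";
print "Content-type: text/html\r\n\r\n";
print "<html><body>Random number: $n</body></html>\n";
```

A script written this way has a much better chance of behaving the same under IIS on NT as it does under Apache on UNIX.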
Well, NT has a hard time with CGI processes, and that makes it an absolute hog when it comes to Perl. It also has a somewhat more limited port of Perl, so keep that in mind. NT is also *somewhat* less secure when it comes to permissions and CGI.
Let's face it, UNIX is just far better at processing CGI scripts, so a CPU-intensive CGI script on NT will need a much faster server to keep up. NT is basically not a good choice for things like my UBB (or this one).
As the systems coordinator of a regional office for a company with over 10,000 workstations and servers nationwide running on an intranet, I can say that NT can be as secure as a Unix machine if it is set up properly. This requires NTFS partitions, which all NT servers should be using. You can then set access rights the same as you would on a UNIX machine: per directory, per file, and per user/group.
The problem with Perl CGI scripts is that they are predominantly created by people who don't have access to an NT server for development purposes. NT excels at CGI applications fully compiled to run natively. Most CGI on NT is created in C++ or some other compiled language. This lets you access system services such as MAPI (email) and TAPI (telephone and fax), which Perl simply isn't able to do. NT is very strict about what processes are allowed to run unchecked. The problem is that running CGI programs is one of the largest potential security holes in any system, because they are executed by a user with no security access, yet they allow that user to execute vital system functions such as read/write, email, and faxing.
If you look, the companies providing CGI applications that are serious about their business offer both *NIX and NT versions of their scripts, or have created scripts that work in both environments using modern programming techniques.
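One example of such a cross-platform technique: instead of piping to /usr/lib/sendmail (which only exists on UNIX), a script can speak SMTP directly with the Net::SMTP module from the standard libnet bundle, so the same code runs on UNIX and NT. A minimal sketch, where the mail server name and both addresses are placeholders you'd replace with your own:

```perl
#!/usr/bin/perl
# Portable mail sending via Net::SMTP instead of sendmail.
# "mail.example.com" and the addresses below are placeholders.
use Net::SMTP;

my $smtp = Net::SMTP->new('mail.example.com')
    or die "can't connect to SMTP server";

$smtp->mail('webmaster@example.com');     # envelope sender
$smtp->to('user@example.com');            # envelope recipient
$smtp->data();
$smtp->datasend("Subject: test from CGI\n");
$smtp->datasend("\n");                    # blank line ends the headers
$smtp->datasend("This script works the same on UNIX and NT.\n");
$smtp->dataend();
$smtp->quit;
```

The only platform-specific piece left is the SMTP server name, which most scripts make a configuration variable at the top of the file.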