Difference between Web & Software Development

Is there any difference between web development and software development?

Actually, no, not that I can think of. A web developer is one who creates software programs that run on the web. What a program runs on makes no difference to a developer except the interface. In this case, it’s the internet, servers and browsers. In other cases, it’s a desktop/laptop computer or mechanical and other electronic devices.

There are some differences, mostly language-specific, but I would also think (never having done it myself) that a software developer needs to know an awful lot about the underlying operating system platform and things like registry manipulation (in the case of Windows machines). With web development you really don’t need to worry about that kind of stuff most of the time.

I’m a web applications developer, and I have always thought my job (as opposed to a software developer’s) is a lot easier because I don’t need to know most of that stuff. I still need to interact with the OS, but not to the same depth.

There are some very profound differences in the way you have to approach a web application as compared to a desktop application, mostly in the fact that desktop applications are usually more robust and require a higher level of programming expertise (memory management, I/O stream control, advanced data structures, etc.).

As for programming theories, methods and models, they are almost identical, as you can apply good programming practices across the board.

One major difference is that a web application is usually accessed via the HTTP protocol, which is stateless. You will either have to maintain state information on the server, or pass it along for each request (usually in a cookie).

This means a very different approach from ordinary desktop applications.
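Because HTTP forgets everything between requests, the state has to ride along somewhere, most often in a cookie. A minimal sketch of the cookie approach using Python’s standard library (names like `session_id` are purely illustrative, not any particular framework’s API):

```python
from http.cookies import SimpleCookie

# Parse the Cookie header a browser sends back with each request.
incoming = SimpleCookie("session_id=abc123; theme=dark")
session_id = incoming["session_id"].value  # the state we stashed earlier

# Build a Set-Cookie header to hand state back to the client,
# since the protocol itself remembers nothing between requests.
outgoing = SimpleCookie()
outgoing["session_id"] = session_id
outgoing["session_id"]["httponly"] = True  # keep it away from page scripts
header = outgoing["session_id"].OutputString()
```

In practice most servers keep the real state in a server-side session store and put only an opaque identifier like this in the cookie.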

It’s like comparing a horse and a tractor.
You can get them to do pretty much the same thing, but when something breaks you can’t really fix them the same way. :stuck_out_tongue:

You have not shown any differences other than the interface. Whether it’s the web or an OS or a vending machine, software is programmed the same, only the interface changes.

I recently moved into the software development field from web development. I’m just new at it, and I find designing applications the biggest difference. There doesn’t seem to be much room for customisation like websites have with CSS,
so I’m finding it difficult to work out how to create excellent-looking applications.

Web applications usually do different things. I wouldn’t say it requires less expertise; I think a real web programmer knows just as much as a desktop application programmer. Not necessarily about systems and memory, but they would know more about security and validating user input.

Web applications are stateless and require a lot more thought about security.
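The input-validation point is worth a concrete sketch. A minimal, hypothetical Python example (the function names and the whitelist rule are assumptions, not any framework’s API): whitelist-check a username, and escape user-supplied text before it reaches the page.

```python
import html
import re

def validate_username(raw: str) -> str:
    """Whitelist check: only word characters, 3-20 of them.
    Rejecting everything else is safer than trying to blacklist."""
    if not re.fullmatch(r"\w{3,20}", raw):
        raise ValueError("invalid username")
    return raw

def render_comment(text: str) -> str:
    """Escape user text before embedding it in markup, so a
    <script> tag becomes inert text instead of running code."""
    return "<p>" + html.escape(text) + "</p>"
```

A desktop app trusts its own widgets far more than a web app can ever trust a request, which is why this habit looms so much larger on the web side.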

Web applications, I think, also do different things: a word processor, for example, is an application the user runs to do something for themselves, whereas a web application is usually an online interface between a business and its clients.

There used to be a huge difference. Back before web “2.0”, before “1.0” even, when pages were nothing more than HTML. HTML is not a programming language. It is a markup language. Its whole purpose in life is to describe content.

As time went on, people tried to make their pages look better. Enter the world of table-based design.

Then people wanted other people to DO things with their pages. Enter JavaScript, PHP, etc.

So in the beginning, they were two separate entities. Now, not so much.

FWIW, using Visual Studio (.NET) or Xcode (Cocoa) (http://www.apple.com/macosx/features/xcode/) to develop a modern desktop app is not that hard; it depends on how complex what you need to do is. It gets harder with increasing complexity of functionality and UI.

At the same time, if all you need is an intranet wiki, building it as a desktop app is not the most sensible choice; even if it were done, it would be overkill.

Building a Photoshop clone is very complex. As Adobe is proving, it’s not impossible to do on the web with RIA tools like Apollo or XUL or some use of Python, but a desktop app that has all that local horsepower still has some advantages for the time being if you really need to do something intensive.

For Apple, all your apps will look sorta like iTunes if you use Xcode well - which makes your app easy to pick up - a good thing. Same for .NET - the “canned” widgets, controls and layouts take some getting used to but you can make it slick with subtle gradients and whatnot - really look at those OS dialogs - they aren’t that complex visually - mostly tabs, sliders and checkboxes.

It is a good point that you are limited cosmetically when using premade controls and layouts but you can still make a solid looking app. I think we will see the tools get better as far as what they offer in control and design.

Interestingly, XUL, Apollo and .NET all do or will offer ways to have an offline and an online mode, so the separation between desktop and web apps will soon narrow to the point of nonexistence. You can build a web app that also works on the desktop (and syncs back up when online if need be), or a desktop app with lots of web functionality, with the close-to-the-metal issues like I/O taken care of for you.

How has everyone missed the point of user accessibility, with regard to where users can access the system from? Once you develop a web app, your users can access it from anywhere in the world. If you develop software that runs on a desktop, they must be in front of the computer (physically, or virtually using some remote desktop technology).

There is also much to be said about security and protecting your assets when you develop for the web.

I agree with what other people are saying on here. I have been on both sides (web and software development). When you’re a software developer, you typically have to know a lot more about the OS. Web applications are stateless too, so that is a big difference.

Once again: everyone is trying to say web and software development are different due to the interfaces. I am saying there is absolutely no difference between the two. You write functions, create data, move information, and the ONLY difference between programming the web and programming the desktop is the interface between your program and the device or object. Be it a screen terminal or the web or a robotic arm, it’s all the same.

If you think that statement is wrong, then answer this: if you write a program to output the display of a table of data, what is the difference between sending that data to a screen, the web, a handheld instrument or a cell phone? The only difference is the interface. Absolutely nothing else.
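That argument can be put in code: the same table-building logic feeding two different output layers, so only the interface changes. A hypothetical Python illustration (all names invented for the example):

```python
def make_table(rows):
    """Core program logic: produce the data, independent of any interface."""
    return [("name", "qty")] + [(name, str(qty)) for name, qty in rows]

def to_terminal(table):
    """Screen interface: fixed-width text for a terminal."""
    return "\n".join("  ".join(cell.ljust(8) for cell in row) for row in table)

def to_html(table):
    """Web interface: the same data wrapped in markup."""
    body = "".join(
        "<tr>" + "".join(f"<td>{cell}</td>" for cell in row) + "</tr>"
        for row in table
    )
    return f"<table>{body}</table>"

table = make_table([("bolts", 40), ("nuts", 12)])
```

Whether that boundary is the *only* difference, as claimed above, is exactly what the rest of the thread argues about.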

Web [application] development is a branch of software development. Desktop [application] development is another branch. Each branch has particular idiosyncrasies as well as best practises and caveats regarding design, development models and maintenance.

Due to the differing capabilities and limitations of each platform, different approaches are used to best leverage delivery and capture of information. Certainly these differences are most apparent at the interface level, but generally a software developer wouldn’t be confined to building the data access layer or business logic layer and not be involved in the UI layer at all.

Often a developer will have strengths in a particular platform or range of platforms and recognizing this is key to selecting the correct project / candidate match.

I think people in here are including HTML in with web programming.
HTML is like… umm… Windows Forms to C#.

I think the biggest fundamental difference in the two is that OS applications are client-based, whereas web applications are client-server based.

The languages used can differ - but not by much. After all, you can use PHP to develop a client-based application and, similarly, you can use C++ on your web server. Technologies such as .NET and Java are applicable to both platforms.

How you choose between web and OS depends on your requirements. There are certain things that OS apps do well (fast games, distributed processing, file handling) and certain things web apps do well (cross-platform, centralised data, multi-user).

But the boundaries are blurring. Many desktop apps have internet connectivity, and many web apps are offering facilities that were typically OS-based, e.g. GoogleDocs.

But it’s not often you see programmers crossing from one platform to the other. In my experience, I would say that Windows developers have a harder time moving to web development than the other way round. I think that’s primarily because:

  1. Web development uses a wider range of technologies, e.g. HTML, CSS, JS, PHP/.NET/Java/Ruby/Python/etc, XML, SQL, as opposed to one OS, one main development language, and perhaps a single database.
  2. Desktop developers can target a single OS and write code that is known to work on that platform (OK, so it’s not necessarily that simple, but you get the idea). Web developers don’t have that luxury: the user could be using any browser, any OS, have a slow connection, may have images/CSS/JS disabled, etc.
  3. A programmer who knows every HTML tag and CSS directive won’t be able to create a decent web page until they have experience of browser quirks, layout techniques, and limitations. It’s a bit of an art, and takes time.

But, ultimately, if you can program in one language, those skills can be transferred to another platform. Whether you like it is another matter.

can’t agree more

Some other reasons are the fact that the speed of your web application matters. There are many people running the same application from one machine; if the code is moderate on your own desktop, I bet it’ll be slow when you have a few users on one server using the same application :wink:
I think web programming (remember, this is my opinion) teaches you to optimize your code as much as possible, whereas with desktop programming I honestly don’t care if it’s slow, because it basically has a whole processor to itself (I know it doesn’t, but it doesn’t have to run multiple times like it would on the web).

I found it really easy to learn C# from PHP (it took me a few hours to build a basic application the first time; that was a bot to delete MySpace messages). The only thing I found hard was sockets, but after searching for hours on the web I finally found a nice class library that makes them easy. lol :slight_smile:

Then, when I grasped C#, ASP.NET was fine.

But the problem is, with C#, I don’t think I’d be able to… umm… “hand code” applications like I can with PHP. I’ll rely on the IDE for a long time, I believe…

I’ve started on C++ recently, hand-coding console applications. It’s a lot harder than C#, but it’s just like PHP!

So yeah, Windows developers will have a hard time switching over to web applications. C++ developers won’t have as much of a problem, due to PHP’s syntax. But still, the optimization side won’t be easy…

After coding in PHP for a few years, it took me a year and a half to realise MySQL queries are what will bog down most applications, and it took me two years to understand that RAM = God :wink:

[quote]after coding in PHP for a few years, it took me a year and a half to realise mysql queries are what will bog down most applications, and it took me two years to understand that RAM = God[/quote]

Nod nod, I went through the same experience.
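The point about queries bogging down applications is most often tackled by caching repeated reads in memory (the “RAM = God” lesson). A minimal sketch in Python; `run_query` stands in for whatever database call you actually use, and the time-to-live scheme is just one assumed policy:

```python
import time

_cache = {}  # query text -> (timestamp, result)

def cached_query(sql, run_query, ttl=60):
    """Serve repeated reads from memory instead of hitting the
    database on every request; entries expire after `ttl` seconds."""
    hit = _cache.get(sql)
    if hit is not None and time.monotonic() - hit[0] < ttl:
        return hit[1]                      # cache hit: no DB round trip
    result = run_query(sql)                # cache miss: run it once
    _cache[sql] = (time.monotonic(), result)
    return result
```

Real deployments usually push this out of the process into something like memcached so all the server’s workers share one cache, but the shape of the idea is the same.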

As part of my profession it has fallen to me once or twice to quantify the differences between the two roles, if any. I have written a white paper and a couple of advisories on this subject, and if I could post them here I would; contractual obligation and client confidentiality unfortunately forbid it. So I will attempt, to the best of my ability, to explain a few conclusions that a few focus groups made up of business representatives, developers and myself managed to come up with.

This needs to be split into at least two areas of consideration:

  1. Past and present situations.
  2. Corporate and personal expectation and reality.

(and various others… but the above encapsulate most of it)

In the days of yore, developing interactive, non-static web applications tended to be the province of C++ developers via the CGI standard. This required an in-depth knowledge of communications and the server’s OS. Later this progressed to Perl implementations, which started to allow abstraction of the hardware layers from the implementation, allowing greater productivity. However, Perl as a language suffers somewhat from a geeky image and never really got the popular vote.

It wasn’t until PHP, ASP and their ilk that web development started to come to the masses. ASP (with VBScript/JScript) and PHP are both exceptionally well abstracted from the hardware and only require a superficial knowledge of the underlying systems to be productive. Due to this, pretty much anyone who could string together an if statement and understood basic algebra could start creating dynamic websites.

More recently this is beginning to change again. Websites are beginning to look and work like applications (SaaS is a good example of the future) and the distinctions between Web and Desktop applications are beginning to blur. Developers are being required to have more than a superficial understanding of the hardware and systems they are using and this is actually beginning to become a problem that business is starting to notice.

The cohorts out there are mostly beginning to polarise into roughly four factions.

  1. Designers who became web developers and stick to their preferred technology (e.g. PHP/ASP) no matter where it takes them.

     These individuals are usually self-employed or employed by smaller firms. Most of the larger corporations are beginning to ignore this group, as many see PHP and ASP as inadequate.

  2. Designers who became web developers going back to near pure design.

     This is by far the most common route for an ex-designer. These designers are quite sought after by the larger corporations, as they at least understand development work and can even work on client-side scripting.

  3. Developers who became web developers and stick to their preferred technology no matter where it takes them.

     This group is more and more coming to be seen as specialists, or more often as outmoded and unemployable. Most employment in this area is now becoming contract or outsource roles (see point 1). Most developers who did this are widely (and possibly erroneously?) considered to be individuals looking for an easy life in their day-to-day work.

  4. Web developers, or developers who became web developers, going to near pure software development, or moving into SaaS, SOA and related technologies.

     This is by far the most common path for developers, although those coming from web development backgrounds are finding the path harder, as they have to re-learn and work around mistaken perceptions generated by their PHP/ASP pasts. Those coming from a non-web background who moved into the web arena are the best equipped and the most sought after.
In basic conclusion to the question “…Is there a difference between a Web Developer and a Software Developer?…”

In essence no, a software developer is a software developer, web or otherwise. However, if you consider ‘Web Development’ to be different from ‘Non-Web Development’ then historically yes, there is a difference, one that is beginning to blur.

A few years ago, Web Developers generally required less technical knowledge, although the larger the system being developed, the more technical knowledge was required, and coincidentally this seemed to go hand in hand with a scale like this:

The smaller a system, the more PHP, ASP and their ilk were used.
The larger a system, the less PHP, ASP and their ilk were used.

As applications got bigger, COM/COM+, Java and more recently .NET were used more, and these require much more technical knowledge than is needed to create a site using PHP, ASP and their ilk.

These days as ASP.NET and equivalents are taking their share of the web, higher skill levels are required if you want to go beyond the productivity basics supplied by their respective vendors.

As always there is the question of scale; with scale comes complexity, and that’s the basic truth of it.

Much of my work recently has been to do with repair. It is an unfortunate trend that businesses and technical teams underestimate the cost and requirements of using the newer technologies, and they can easily find themselves out of their depth.

Teams who have happily worked for a decade with the script-based technologies, with a smattering of back-end components, have been rudely shocked by the differences when attempting upgrades. Unfortunately there is also a growing number of developers who are amazing with the older web technologies but are beginning to flounder, and it’s a sorry thing to see, especially in the UK where training is a rare commodity.

As a personal view, therefore, I must say that yes, there is a difference, and it costs business a lot of money. Hopefully time will remove those differences.