Native JavaScript Equivalents of jQuery Methods: the DOM and Forms

The debate is still raging on my recent Do You Really Need jQuery? article but, in summary, there are two reasons why using jQuery is a good idea:

  1. you need to support IE6/7/8 (remember you can’t migrate to jQuery 2.0), or
  2. without jQuery, you’d spend longer writing a jQuery-like library than developing your application.

For everything else, be pragmatic. jQuery is a 270Kb generic library. You are unlikely to require all the functionality it provides and, even if you omit certain modules, it remains a sizable quantity of code. You may load the 30Kb minified and gzipped version from a CDN, but the browser must still halt processing to parse and execute the code on every page before doing anything else.

This is the first in a series of articles showing native JavaScript equivalents of common jQuery methods. While you might wish to wrap some of these in shorter alias-like functions, you certainly don’t need to create your own jQuery-like libraries.
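
As an illustration of such a wrapper, the sketch below aliases the native selector APIs behind short functions. This is a minimal sketch: the $ and $1 names are arbitrary choices for illustration (and would clash with jQuery if both were loaded).

// minimal sketch: thin aliases over the native selector APIs
function $(selector, context) {
  return (context || document).querySelectorAll(selector);
}

function $1(selector, context) {
  return (context || document).querySelector(selector);
}

// usage
var summaries = $("p.summary");
var firstArticle = $1("article#first");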

DOM Selectors

jQuery permits DOM node selection using CSS selector syntax, e.g.

// find all paragraphs with the class "summary" in the article with ID "first"
var n = $("article#first p.summary");

The native equivalent:

var n = document.querySelectorAll("article#first p.summary");

document.querySelectorAll is implemented in all modern browsers and IE8 (although that only supports CSS2.1 selectors). jQuery has additional support for more advanced selectors but, for the most part, it’ll be running document.querySelectorAll inside the $() wrapper.

Native JavaScript also provides four alternatives which will almost certainly be faster than querySelectorAll if you can use them:

  1. document.querySelector(selector) — fetches the first matching node only
  2. document.getElementById(idname) — fetches a single node by its ID name
  3. document.getElementsByTagName(tagname) — fetches nodes matching an element type (e.g. h1, p, strong, etc.)
  4. document.getElementsByClassName(class) — fetches nodes with a specific class name

The getElementsByTagName and getElementsByClassName methods can also be applied to single nodes to limit the result to descendants only, e.g.

var n = document.getElementById("first");
var p = n.getElementsByTagName("p");
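
Note that these methods return array-like node collections (live ones in the case of the getElementsBy* methods) rather than true arrays, so you typically iterate them with an indexed loop. A minimal sketch using the variables above:

// the collection is array-like, so loop with an index
for (var i = 0; i < p.length; i++) {
  console.log(i, p[i]); // do something with each paragraph node
}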

Let’s do some testing. I wrote a small selection of scripts to retrieve all the comment nodes from my Do You Really Need jQuery? article 10,000 times. The result:

// jQuery 2.0
var c = $("#comments .comment");
// time: 4,649 ms

// jQuery 2.0
var c = $(".comment");
// time: 3,437 ms

// native querySelectorAll
var c = document.querySelectorAll("#comments .comment");
// time: 1,362 ms

// native querySelectorAll
var c = document.querySelectorAll(".comment");
// time: 1,168 ms

// native getElementById / getElementsByClassName
var n = document.getElementById("comments");
var c = n.getElementsByClassName("comment");
// time: 107 ms

// native getElementsByClassName
var c = document.getElementsByClassName("comment");
// time: 75 ms
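
The original test scripts aren’t shown, but a timing harness along these lines would reproduce the approach described above (a sketch, not the author’s actual code):

// sketch: time a selector function over many iterations
function timeSelector(fn, iterations) {
  var start = Date.now();
  for (var i = 0; i < iterations; i++) {
    fn();
  }
  return Date.now() - start; // elapsed milliseconds
}

var ms = timeSelector(function () {
  return document.getElementsByClassName("comment");
}, 10000);

console.log(ms + " ms");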

I can’t claim strict laboratory conditions and it doesn’t reflect real-world usage but, in this situation, native JavaScript was up to 60x faster. It also illustrates that fetching nodes by ID, tag or class will normally be preferable to querySelectorAll.

DOM Manipulation

jQuery offers several methods for adding further content to the DOM, e.g.

$("#container").append("<p>more content</p>");

Beneath the surface, jQuery uses the native innerHTML property, e.g.

document.getElementById("container").innerHTML += "<p>more content</p>";

You can also use DOM building techniques. These are safer but rarely faster than innerHTML:

var p = document.createElement("p");
p.appendChild(document.createTextNode("more content"));
document.getElementById("container").appendChild(p);
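
If you are appending many nodes, each insertion into the live document can trigger another costly redraw. One common refinement (a sketch, not part of the original example) is to build everything in an in-memory DocumentFragment and append it in a single operation:

// sketch: batch several new paragraphs into a fragment, then append once
var fragment = document.createDocumentFragment();

for (var i = 0; i < 3; i++) {
  var para = document.createElement("p");
  para.appendChild(document.createTextNode("more content " + i));
  fragment.appendChild(para);
}

// a single insertion, so the document is only updated once
document.getElementById("container").appendChild(fragment);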

We can also remove all child nodes in jQuery:

$("#container").empty();

The native equivalent using innerHTML:

document.getElementById("container").innerHTML = "";

or a short loop:

var c = document.getElementById("container");
while (c.lastChild) c.removeChild(c.lastChild);

Finally, we could remove the whole element from the DOM in jQuery:

$("#container").remove();

or native JavaScript:

var c = document.getElementById("container");
c.parentNode.removeChild(c);

Scalable Vector Graphics

The core jQuery library was developed to work with HTML documents. SVGs also have a DOM, but jQuery does not offer direct manipulation of those objects because it’s normally necessary to use namespace-aware methods such as createElementNS and getAttributeNS. jQuery can be made to work and several plug-ins are available, but it will be more efficient to roll your own code or use a specialized SVG library such as Raphaël or svg.js.
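
As a rough illustration of what that namespace handling looks like, the following sketch creates a simple SVG circle natively (the sizes and color are arbitrary):

// sketch: SVG elements must be created in the SVG namespace
var svgns = "http://www.w3.org/2000/svg";

var svg = document.createElementNS(svgns, "svg");
svg.setAttribute("width", "100");
svg.setAttribute("height", "100");

var circle = document.createElementNS(svgns, "circle");
circle.setAttribute("cx", "50");
circle.setAttribute("cy", "50");
circle.setAttribute("r", "40");
circle.setAttribute("fill", "#c00");

svg.appendChild(circle);
document.body.appendChild(svg);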

HTML5 Forms

Even the most basic web application will have a form or two. You should always validate user data server-side but, ideally, you’ll supplement it with client-side validation to capture errors before the form is submitted.

Client-side validation is straightforward:

  1. You run a function when the form is submitted.
  2. If any issues are encountered, you halt the submission and show an error.

You can use jQuery. You can use native JavaScript. Which should you choose? Neither.

HTML5 has built-in support for various common input types such as email addresses, telephone numbers, URLs, numbers, times, dates, colors and custom fields based on regular expressions. For example, if you want to force the user to enter an email address, use:

<input type="email" name="email" required="required" />

There’s no need for additional JavaScript or jQuery code unless you require a little more sophistication such as comparing two or more fields or showing custom error messages.
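
Where that extra sophistication is needed, the native constraint validation API can usually handle it without jQuery. The sketch below checks that two fields match; the form ID "signup" and the field names "email" and "confirm" are illustrative assumptions, not part of the markup above:

// sketch: flag a mismatch using the constraint validation API
// ("signup", "email" and "confirm" are hypothetical names for illustration)
var form = document.getElementById("signup");
var email = form.elements.email;
var confirmField = form.elements.confirm;

confirmField.addEventListener("input", function () {
  if (confirmField.value !== email.value) {
    // a non-empty message marks the field invalid and blocks submission
    confirmField.setCustomValidity("Email addresses must match.");
  } else {
    confirmField.setCustomValidity("");
  }
});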

Older browsers (including IE9 and below) do not understand the new types and will revert to standard textual input fields. Those users will fall back to server-side validation; not a great experience but you can apply a shim or hope those people see the light and upgrade.
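
If you want to detect support before loading a shim, a widely used trick is to set the type on a new input and check whether the browser keeps it (a small sketch, not from the original article):

// sketch: browsers that don't understand type="email" revert it to "text"
function supportsInputType(type) {
  var input = document.createElement("input");
  input.setAttribute("type", type);
  return input.type === type;
}

if (!supportsInputType("email")) {
  // load a validation shim/polyfill here
}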

In my next article, we’ll examine native CSS class manipulation and animation.

Comments

  • http://niteodesign.com Blake Petersen

    Wow! A 60x performance increase, that’s pretty amazing. May never use jQ again!

  • DaveMaxwell

    In your last two speed examples you’re not going to get the same results – you’re not limiting them to the elements with the class of "comment" which are children of the element with the ID of "comments". The parsing of that resulting array will be slower because the result set will be larger and the parent ID will need to be checked each time.

    The 107 ms example can easily be fixed by changing var c = document.getElementsByClassName("comment"); to var c = n.getElementsByClassName("comment");

    The 75 ms example would have to include a parent check, which might be a bit slower (it’d be interesting to test that).

    • http://www.optimalworks.net/ Craig Buckler

      Sorry Dave — there was a mistake in the second to last example and it should have used ‘n’ as the root element (the actual code used and test result was fine, though).

      In this particular test we were using a WordPress-generated page so there would never be any .comment elements outside the #comments section. However, I wanted to run a few scenarios to check the speed differences if that couldn’t be guaranteed. It’s safe to say that a parent check would certainly be slower since we’d need to perform the validation and push elements to a new array.

  • http://www.jontetzlaff.com Jon Tetzlaff

    I think there are quite a few situations where we as developers choose the easy route with jQuery when we could use vanilla JavaScript. You can achieve some nice performance and load-time gains if you choose to just use pure JavaScript. Always something we should be considering.

  • Dennis Baskin

    First, I want to say that this is a great article and I cannot wait to see the rest of the series. I also want to comment on real world situations vs lab results that you mentioned.

    I’ve worked with old .NET WebForms pages that use viewstates, and when these pages are written poorly they can produce huge, chunky viewstates in the DOM. As for jQuery being involved, we had a large navigation menu that was dynamically built on the dashboard side of the product. I was using jQuery to append DOM nodes in a recursive function. There were only about 150-250 inserts, depending on the user, and it was fine for every page but one. One legacy page had a viewstate so huge and unnecessary that it made jQuery DOM insertion crawl. When we tried switching to native JavaScript functions it was far faster, and string building was faster still.

    Just thought I’d put my two cents in. Maybe others still work with viewstates in .NET, and it is something to be aware of as well.

    • http://www.optimalworks.net/ Craig Buckler

      Thanks Dennis. I never liked the viewstate in .NET and always disabled it. It was an interesting idea but reasonably complex forms could force it to exceed several megabytes.

      With regard to efficiency, browser redraws are computationally expensive so it’s better to build an in-memory DOM fragment then apply the whole section to the page in a single operation. That’s possible in jQuery but the concise nature of the library gives the impression commands are optimized when they’re not.

  • http://www.meldville.com/ ket

    What do you replace this with in native JavaScript?

    $.ajax({
      type: "POST",
      url: url,
      data: data,
      success: success,
      dataType: dataType
    });

    It’s the only reason that keeps me using jQuery.
    Thanks,

    • http://www.optimalworks.net/ Craig Buckler

      Keep reading SitePoint. It’ll be covered in part 3 of this series!

  • Lewis

    I’m glad to see this carrying on and being taken seriously.

    The speed improvements do not surprise me, and this is before you take into account all those pages with multiple versions and multiple libraries on one page.

    It’s not just the speed though, it’s the flexibility you get. You can customise to your heart’s content going native.

    Yesterday I wrote a JavaScript image magnifier based on something a client had seen on another site. The plugin they used was on sale for $60; you could install that and then try your best at customising it, or, like I did, piece together tested functions from other scripts I have written or found online to make exactly the same thing, completely customised to the client’s branding and site layout.

    It probably took a few hours longer but it’s a better result, and the more you develop JavaScript the fewer hurdles you face each time, especially with the worst-offending browsers now slipping away.

  • Les

    Wow… Such a performance gain, let’s all just drop jQuery and go back to the bad ol’ days of having to invest time and (someone else’s) money in developing client-side JavaScript, yer? So… what, you shave some millionths of a second off using native JavaScript? The benefit of jQuery overall saves so much hassle for everyone; a few millionths of a second aren’t ever going to be missed.

    It’s like those times 5 or 6 years ago when people said use print() instead of echo() for a performance gain, remember? What a …. load of horse manure. Really people, do we have nothing better to do with our time (life?) than to waste it making comparisons over a few lousy …. millionths of a second?

    Christ.

    • http://www.optimalworks.net/ Craig Buckler

      A few milliseconds? A millisecond is a thousandth of a second — not a millionth. jQuery took more than four seconds in this test. Native JavaScript was instantaneous. That *is* noticeable in a client-side application. The test certainly doesn’t reflect typical real-world usage, but it does illustrate the point that using jQuery won’t magically make your code efficient despite the concise chaining syntax.

      There are undoubtedly benefits if you’re using jQuery to support IE6/7/8. There may also be benefits in using jQuery for a large application. However, you seem to be clinging to the notion that JavaScript development is too difficult or inconsistent in modern browsers. If anything, modern HTML5 development results in writing less code and reducing page weight. How is that bad for you or your clients?

  • Jeroen

    Great article!

    But I fear you are fighting a battle you can’t win Craig. Javascript is one of the easiest programming languages to learn, but one of the hardest to master. After 3 years of hard work and writing tons of lines of Javascript, I still end up hacking at code instead of writing it.

    ‘JavaScript developers’ often think they are using jQuery properly but end up doing the wrong thing with the wrong tools for the wrong reasons. (Been there, done that a lot…)
    They write code like
    $('.my-selector').doSomething();
    $('.my-selector').doSomethingElse();
    $('.my-selector').andLetsDoSomethingAgain();
    or
    $('.my-selector').doSomething().doSomethingElse().andLetsDoSomethingAgain();
    thinking it is well written, easy to maintain and scale, very fast…

    That is simply frustrating.
    If you then tell those ‘developers’ to use vanilla JavaScript and learn prototypes, late binding and so on, I fear that at the end of the day programs will be slower, harder to debug and full of exotic code that hacks at a problem that wasn’t there in the first place.

    So to end: stupid article, Craig! You should never have written it.

    • http://www.optimalworks.net/ Craig Buckler

      Thanks Jeroen.

      I admit it takes some time to appreciate JavaScript, especially if you’ve come from a standard C-like language such as Java, PHP or C#. However, unlike a few years ago, there are many more tutorials, books and videos to help.

      If you understand JavaScript, you’ll understand jQuery and it’ll make you a more proficient programmer no matter which library you use. Unfortunately, the opinion that jQuery is somehow a different or better language than JavaScript is pervasive and many developers will not consider running naked and free!

  • Kenny Landes

    I think this is one of the most useful articles I’ve ever seen from this author. Thank you.

    • http://www.optimalworks.net/ Craig Buckler

      Thanks. It’s only taken me 840 articles to get there!

  • http://dummiesmind.com Pawan

    I liked your previous article "Do You Really Need jQuery?". Great job! But I somehow feel it’s just like the way people used to argue "why use Java when C is so fast?". But we all know that we get flexibility, clean code and good ways to write code. We also know that jQuery has been tested, so we can trust it. I agree that performance matters a lot, but I somehow feel that jQuery has done a great job of promoting JavaScript and has allowed developers to trust JavaScript and take it seriously as a programming language.

    • http://www.optimalworks.net/ Craig Buckler

      jQuery has certainly been tested. But so have the native browser APIs which jQuery uses. You’re effectively cutting out the middle-man.

      Admittedly, some newer browser APIs are inconsistent but the jQuery core won’t help with those anyway.

  • http://0tocash.com Dave

    The 3rd reason to use jQuery is that it’s a required include of a lot of popular frameworks and libraries.

  • http://bkuri.com Bernardo Kuri

    Excellent article, Craig. I would really like to know if using Zepto (or MooTools) instead of jQuery makes things any faster. Sure, native JS will always win performance-wise, but I’m curious to see if any other library improves things a bit more when compared to jQuery.

    Finally, I believe the almost 5-second delay of your first test may be due to the fact that you didn’t exactly provide a scope (at least not the way jQuery likes it). A more correct (and I bet much faster) version of that test would’ve gone like this:

    var c = $(".comment", "#comments");

    Keep up the great work. Looking forward to the rest of the series on this topic.

    • http://www.optimalworks.net/ Craig Buckler

      Thanks Bernardo.

      I’ve just tried var c = $(".comment", "#comments"); and it took 2,600ms for 10,000 iterations. Interestingly, it was faster than just $(".comment"), but it was still 34x slower than native JS.

      The primary reasons jQuery is slower are:

      1. You’re calling a function rather than a native API.
      2. jQuery parses the selector string. Not as much as it used to, but there’s some processing.
      3. The resulting node collection is converted to an array.

      Most of the libraries do similar things and, if anything, jQuery is often faster than the competitors.

      • http://ashishkumar.asia Ashish

        You are right. It’s great to use native JS instead of other external plugins, but jQuery is not only for selecting DOM elements. It provides many things like Ajax, animations and much more, and the best part of these things in jQuery is CROSS-BROWSER SUPPORT, even in old browsers like IE6 and 7.

        But, everything depends on the audience of the product you are making, right?

      • http://www.optimalworks.net/ Craig Buckler

        Thanks Ashish. Part 2 (CSS manipulation) has just been published with events, Ajax and utilities coming soon.

        A major reason for using jQuery is cross-browser support. However, jQuery 2 has dropped IE6/7/8. So, do you still need jQuery given that all the other browsers are consistent?

  • http://sww.duplou.com Carlos Adan

    I always thought you should check input with HTML, JavaScript and server-side code… just in case one of them fails. I didn’t really get the point of not checking using JS. I think you should never completely trust users.

    • http://www.optimalworks.net/ Craig Buckler

      You should definitely validate user data server-side (as mentioned in the article) but, to make things easier for the user, a client-side check is also recommended. You should never rely on client-side validation only — data can be spoofed.

  • Jason

    Speed in Development time vs Speed in Performance. Always a consideration on any project.

    We’re developing websites, not video games so performance can take a hit as far as I am concerned. If you’re adding that much JavaScript to greatly decrease performance, you’re probably doing it wrong to begin with.

    • http://www.optimalworks.net/ Craig Buckler

      Agreed, but how does the jQuery core make you a faster coder? Admittedly, the native APIs are rarely as concise as jQuery syntax, but they’re hardly excessive and you could easily write a few shortcut functions.

      Uncompressed, jQuery is almost 300Kb before you add any of your own code. And would you ever use the whole library? Almost certainly not. Development time is only quicker if you’re stringing lots of jQuery plug-ins together rather than writing your own code. Cut-and-paste coding has its place, but it doesn’t make you a proficient (or well-paid) JavaScript developer.

      Finally, average web page weights are approaching 2Mb. While you may have saved a few hours coding, thousands of users could be impacted every time they visit and daily hosting costs will increase accordingly. Is the “performance can take a hit” attitude always cost effective?

      • Jason

        It’s more so the MASSIVE collection of third party jQuery plugins. Everything seems to be written in jQuery, rightly or wrongly. Why re-code something? Especially when clients want something cheap and yesterday. We are competing with India and China more and more these days.

        If you use the jQuery library hosted on Google like a lot of people do, chances are it’s already downloaded to cache, right? Plus, let’s not go spouting 300kb everywhere when 267kb uncompressed and 90kb minified (around 33kb gzipped) are the real sizes of the jQuery library currently, for 1.10.1 (v2 is smaller). 33kb could be the entire theme of a site. 90kb sounds like a lot, but it’s maybe half of what the average user downloads in a second – so half a second to download isn’t a long wait.

        Most websites don’t need a full-time JavaScript developer working on them, come on. If you are building an app it’s probably a good idea, depending on size, but for your brochure sites chances are the designer will have a crack – rightly or wrongly ;)

        2MB? I don’t know where you got that figure from; it depends on whether you have some uneducated designer cutting up transparent PNG24 background images, I suppose. Images are the real enemy here. CSS frameworks are also really big.

        But the jQuery library has grown into a monster, 250kb+ is really silly.

        And performance is going to be based on the size of the site and its audience. Let’s face it, for the average web development company and the average business website, it’s not going to matter at all. Of course performance needs to be taken into account, but there is only so much you need to do with an average site.

      • http://www.optimalworks.net/ Craig Buckler

        It sounds as though we’re talking at cross-purposes.

        If you’re catering for the cheap-and-cheerful end of the web market, then you’re probably binding third-party themes and plug-ins together to make a quick buck. That’s fine, but (a) there’s little money in it, (b) it won’t make you a proficient programmer, and (c) you’ll never care about performance, standards, accessibility, SEO, etc. In fact, it will be more profitable to build an under-performing site which requires significant ongoing maintenance.

        The file size of the jQuery library actually has little impact on performance. As you correctly point out, images are normally larger. However, unlike images, all browser activities are halted while JavaScript is executed after download. Minified files, caching and CDNs do not make that process faster.

        And yes, average web pages reached 1.25Mb in late 2012 and are growing 30% per year:
        http://www.sitepoint.com/web-page-weight-2012/

        But none of this matters if you’re happy building average websites for an average web development company supplying average businesses. Although it begs the question: why do you have the time and inclination to visit SitePoint which is famous for advocating best-practice web development techniques?!

      • Jason

        “If you’re catering for the cheap-and-cheerful end of the web market, then you’re probably binding third-party themes and plug-ins together to make a quick buck. That’s fine, but (a) there’s little money in it, (b) it won’t make you a proficient programmer, and (c) you’ll never care about performance, standards, accessibility, SEO, etc. In fact, it will be more profitable to build an under-performing site which requires significant ongoing maintenance.”
        The top end of business is turning to India and China on a higher level for programming in general. The average local web design/dev shop doesn’t need to worry about that too much. I wasn’t very clear in my point, I was being very general with that outsourcing comment. I agree that banging out sites that are from template farms and code farms isn’t going to make money, build skills or what have you. Lots of people get started in this area so don’t be too insulting towards it Craig.

        “I thought I’d consult the HTTP Archive Report which collates technology statistics from 300,000 of the web’s most popular sites”
        That resource is beyond flawed. You can’t measure the average size of a website by the top 0.0000000000000001% of sites (fake number but I’m just not sure how many websites are out there). It’s like getting the top 100 income earners and saying the average wage is 2.5 billion dollars. Doesn’t work. Most of the top websites are going to have more content and more features.

        On most websites JavaScript won’t take very long to load, or it will initialize after the page has loaded; the user will see further indications of loading with correctly implemented scripts (the circle loading thing used universally across devices).

        I don’t think Sitepoint is quite “famous” for what you think it is. It’s always been the forums.

        “Although it begs the question: why do you have the time and inclination to visit SitePoint which is famous for advocating best-practice web development techniques?!”
        Are you trying to be insulting or trying to talk down at me? It seems that way. I don’t really have the time to be here to be honest but you have to reach out and look for new information. Sitepoint isn’t on my “often visit” list of websites. I’m not against best practices and never said I was, you presented an alternative and I’m just saying for right now – the current way isn’t broken and is extremely practical – in terms of performance, development speed, usability and compatibility. What’s the problem?

        Browser vendors are the enemy, not fellow web developers. Cross browser compatibility is bloat, disgusting bloat.

      • http://www.optimalworks.net/ Craig Buckler

        Your points contradict themselves, Jason.

        1. You state you’re working on “average business websites” where “clients want something cheap” and you’re “competing with India and China” (on price) so “performance can take a hit”. But you now claim you’re not working on a code farm?

        I have nothing against you or anyone else using pre-written themes, jQuery, plug-ins etc. to create a large quantity of “average sites” where quality is not an overriding concern. In the same way, many people have a great career at McDonalds; they just don’t expect to receive a Michelin star.

        2. Next, you state “JavaScript won’t take very [long to] load or initialize after the page has loaded”. That depends on what the code’s doing and, in this case, you can’t guarantee performance because you didn’t write it. For example, many WordPress themes are a bloated mess with dozens of scripts. Besides, if JavaScript is always quick, why is there a need for “further indications of loading”?

        3. As for page weights, I presume many statisticians have quit following your detailed dismissal of sample size determination, confidence intervals and hypothesis testing. But why do you think popular sites have more content on a single page? The world’s most popular page shows a search box (although the total weight will surprise you). You can choose not to believe the surveys, but I urge you to look around and judge for yourself.

        4. You then claim not to visit SitePoint often, but don’t think it’s known for best-practice techniques?… Try raising a few anti-standards comments in the forums!

        5. Next: since when have vendors been “the enemy”? They want you to use their applications and develop sites. The browsers are closer than they’ve ever been, so why are you continuing to add “disgusting bloat” to get around compatibility issues which no longer exist?

        6. Finally, you clearly state performance is not your priority. Why are you defending this notion on an article which is about improving performance? No one’s forcing you to drop jQuery or consider alternative options. However, I would point out that the best developers always care about performance irrespective of financial reward.

      • Jason

        1) I’ve already corrected myself on this in my previous comment, saying that I meant programmers in general (we, as programmers and businesses) need to compete with China and India. Some, not all, businesses, including places like Microsoft, outsource to these countries for higher profits. China and India don’t always mean poor quality. There are plenty of people on SitePoint who swear they are fantastic. I have no experience in dealing with China or India or having to compete with them since my business is locally focused.

        Not all local businesses have large cash supplies. I’ve worked with designers that do custom work and designers that use templates. Being an implementation expert, it’s really up to me to work with what’s given to me. The amount of bought PHP modules I’ve had to fix is… beyond a joke. Same with jQuery modules and the template CSS. Lots and lots of fixing required. I’m all for standards and custom work but it’s not at all practical for most businesses.

        I have no intention of farting out sites for $500 but I don’t think I am better than people that do…

        2) Of course it depends. Yep, I have seen plenty of WordPress sites bloated with crap. One of the reasons I have never really dealt with WordPress. And when sending/receiving information from one web server to another you should show an indication of loading, such as for a Twitter feed or other web service – surely you know this ;)

        3) When I code CSS and JS I keep it small; when I crop images I keep them small without degrading quality too much. You know, checking which image format gives the smallest size. Save those kbs! I’d be more interested in my own work’s average.

        4) Why would I do that? I have no reason to.

        5) They all get along now, do they? You’re saying MS is improving IE for the good of humanity and not because they are losing massive market share? Anything I develop is done with the best interests of the user in mind.

        6) You started to engage me on my comment, it would be rude not to reply? I just made a comment with my thoughts. You replied with questions… I’m just replying… do you not want comments? As I originally stated “Speed in Development time vs Speed in Performance. Always a consideration on any project.”, it doesn’t have to be about money, it can just be about timing. You can fix a project later if need be.

      • http://www.optimalworks.net/ Craig Buckler

        I’m having trouble understanding your reasoning here?

        1. You state jQuery plug-ins offer fast, cost-effective development then admit you’ve had “lots and lots of fixing”? How has it saved time or money?

        2. We’ve not been discussing Ajax data requests, but scripts which must be loaded, parsed and executed. However you look at it, jQuery is not an inconsequential hit.

        3. I’m glad to hear you’re saving those Kb but not everyone is as conscientious as you. Cramming pages with unnecessary scripts is common.

        4. Of course not. But why do you think SitePoint isn’t well known for advocating best-practice techniques? Have we missed something?

        5. All the vendors are W3C members. There is no commercial advantage to building a browser which is different from the others. As for Microsoft, we’d be stuck with IE6 were it not for the competitors. But who cares why Microsoft upgraded IE: I’m glad they did. IE10 is as good as any other browser and normally less hassle than Chrome. (jQuery 2.x has more Webkit than Trident fixes).

        5a. “Anything I develop is done with the best interest of the user in mind.”
        How do you justify that against “performance can take a hit”? That’s helping you — not the user.

        6. I love receiving comments but you’re yet to answer my original question: how does the jQuery core make you a faster coder? In addition, why do you think performance always compromises development speed? We’ve been side-tracked by the other points, but these articles are about writing and using less JavaScript — not more.

      • Jason

        1) 500 lines of pre-written code to fix 1 line of bug. Not much of an issue.

        2) Indeed, loading an image slider isn’t going to kill the internet though.

        3) You have to know when to pick your battles. There is still that overlap of designer vs developer when it comes to frontend development.

        4) I didn’t say it wasn’t, though I’ve never heard anyone say it… that said, I haven’t visited the forums in a while. But while I used to visit them often back in the day, statements like "SitePoint forums has a website" were thrown around jokingly (because people only visited the forums, not the site).

        5) They have a common commercial interest. Not saying it’s bad, it’s good, but it is just business.

        6) Because it can mean writing less code, less time writing code = speed, if you know what you are doing. Performance doesn’t always compromise development speed, it can if you are obsessive about it when performance isn’t a key factor to project success! And I’ve said for the average site etc. it’s not going to cause failure to use jQuery.

        I think shortly into the future we won’t need jQuery. Its size hasn’t crept up, it’s blown out, which is a concern, but a couple of extra seconds on Bob’s Plumbing Co. isn’t going to lose sales, surely.

      • http://www.optimalworks.net/ Craig Buckler

        "I think shortly into the future we won’t need jQuery"

        If you’ve dropped oldIE support, that future is here today … hence the reason for this article!

  • http://www.cwsworldwide.com Empresa de Diseño Web

    Thanks for the code. jQuery is very important in web design today. Greetings from Venezuela!

    • http://www.optimalworks.net/ Craig Buckler

      Thanks Empresa, but I’m suggesting jQuery is less important today!

  • http://marriott.com Amen Moja Ra

    I agree with everything you have said in this article. And lynda.com has even dedicated a new video course to using native JavaScript when it comes to the DOM. I just have one question: how do you justify event handling? In your opinion, is jQuery preferable to just vanilla JS? I think vanilla JS is better because writing functions and then attaching them to events is my preferred way. And since that is the case, I think vanilla JavaScript is the way to go.

    • http://www.optimalworks.net/ Craig Buckler

      jQuery event handling is based on the standard API but it supplements it with some extra features. It’s good, but you rarely need it and can replicate many of the options by attaching handlers to the page body (which jQ does). Event handling will appear in part 3.

  • http://allmytechtalk.com Adam

    Very cool article that demonstrates how overused jQuery is at present, when much faster native JavaScript methods are just as easy to implement in some situations.

    Given the speed increases, this is evidence enough to rewrite some areas of sites and save on loading jQuery altogether.