Dreamweaver 8

deesy58, Dreamweaver doesn’t attempt to render the HTML and CSS; the design window does nothing but place things on the screen, exactly as the tool implies. Dreamweaver is not a rendering engine in its own right: any visual positioning it provides relates to the product’s strict control over how everything looks and feels on the page, not to how it will appear once it’s published on the web. This is the clear difference between a rendering engine (as you would get in the preview window) and the WYSIWYG editor. The visual editor itself has a lot more control over how things look and feel, because it’s not trying to emulate a third-party product’s idea of what “following standards” means. The rendering engines mentioned earlier were Trident (the default rendering engine for Internet Explorer, and a closed-source one at that) and WebKit (an open-source rendering engine used by Safari and Chrome). If Dreamweaver were to accurately portray on screen the same results as every browser (like IE), it would first need access to the source code of IE’s rendering engine (never going to happen), and then it would need to somehow force Microsoft to rewrite their rendering engine to render objects within the web browser as every other rendering engine would.

It’s simply not something that’s going to happen. Your version of Dreamweaver is like IE6… it’s an older product using older code, trying to follow standards which have since changed and to get things looking the same in modern and old browsers alike… something any professional will tell you is next to impossible. When the rendering engines and their various versions can’t even follow the standards entirely or correctly themselves, an equal look across all devices is not going to happen, especially if you leave your code’s viability in the hands of an outdated piece of software which doesn’t know things have evolved. WYSIWYG means “what you see is what you get”, and that’s what the tool is and does… you drag stuff onto the screen and it produces the markup. If you expect any WYSIWYG editor to magically account for all possible rendering scenarios, understand contextual value (to the extent humans can) and “know” what semantically makes sense… you’re out of luck. This isn’t the movie Terminator; we don’t have AI that can think like a human. It’s just a piece of software doing the best job it can… which is, incidentally, no substitute for human knowledge, understanding and the ability to craft something without leaving all the hard work to a visual tool which is about as reliable as using Microsoft Word to build a website. :slight_smile:
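To make “the engines can’t even agree among themselves” concrete, here’s a minimal sketch of the classic box-model disagreement (the selector and the sizes are made up purely for illustration):

/* Made-up rule purely to illustrate the old box-model split. */
#sidebar {
    width: 200px;      /* standards mode: this is the CONTENT width,     */
    padding: 20px;     /* so the rendered box is 200 + 40 + 10 = 250px   */
    border: 5px solid; /* wide; IE in quirks mode instead crammed the    */
}                      /* padding and border INSIDE the 200px, leaving   */
                       /* only 150px for content. Same CSS, two layouts. */

Multiply that by every property and every engine version and you can see why pixel-perfect emulation inside an editor is a fantasy.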

Alex mentioned something I think is very important

I can imagine the horror Rembrandt would feel if he saw some of his works displayed with some of the colors muted or of a slightly different hue, or if the canvas was stretched a bit or cropped.
And I understand that a designer can take pride in how their work looks and be equally desirous of having it render “pixel perfect” across all browsers. But even with rigorous and thorough testing, the best that can be achieved is “satisfactory”, not “exactly”. There are too many inconsistencies and variations at play to have anything better ATM. So if you use Dreamweaver, fine, but don’t assume that because it looks good in Dreamweaver, or even in a few browsers, it will look identical in every browser.

During our discussions, I decided to take a look at a very small public Website that I have had hosted by a well-known hosting provider. The site consists of a very small amount of text telling the viewer that the site is under construction, along with a couple of small images. No CSS. No scripts.

I noticed that the page took more than 30 seconds to resolve and render with IE8, and more than 31 seconds with Firefox 3.6. In addition, Firefox only displayed the text, with no images.

After persistent e-mails to the hosting company’s Tech Support Department, I was told that my Web site had been “hacked,” and that some amount of PHP scripting had been added to the end of the XHTML.

I have been fighting this issue for the past two days. My local machine is clean, and it has been so for a very long time. In addition, I had not uploaded any code to the site since January 14, 2009, which is well before the supposed offending Gumblar worm was apparently released into the wild (April/May, 2009). Now, the hosting company is trying to convince me that it is Adobe products (Dreamweaver, Flash Player, Adobe Reader & Acrobat) that are causing the problems.

I had decided to try to find an academic upgrade from my Studio 8 to CS4 Design Premium, but Academic SuperStore tells me that my universities are not participants in their special pricing program for students and teachers, and $800US is just too much from Adobe.

I found an upgrade on eBay for $349.95, but it comes from the Philippines, and I am worried that it might be either counterfeit or stolen. Besides, with the articles in Forbes magazine and PC World, it sounds like Adobe has more security vulnerabilities in their code than Microsoft. :eek:

What’s a body to do? It sounds like these giant, irresponsible software companies are putting everybody at risk.

BTW, I had the interesting experience of being able to access the Adobe upgrade/update pages just fine with Firefox, but not at all with IE8. Is that strange, or what?

deesy

I’m really leaning towards your hoster being the cause of getting the worm, unless your passwords got grabbed. The site access itself has its very own password used nowhere else, right?

Adobe products may be buggy but I’ve not heard of them being insecure… not really sure how security can really play much of a role in a text-editor-on-steroids, except maybe somehow the ftp part. IE has always been buggy and prolly always will be, and because IE is still the most-used browser, it’s targeted more by the l33t kiddie h4x0rz out there.

with the articles in Forbes magazine and PC World, it sounds like Adobe has more security vulnerabilities in their code than Microsoft.

Was DW specifically mentioned? Cause they also make a (very expensive) back-end application platform and I can imagine that having security issues.

I’ve always had the opposite trouble with Adobe pages: Firefox has the most problems accessing them, and is slower (I was blaming ColdFusion, but it must just be that they hate Linux). Chrome or IE on Windows was always OK (the Download the Reader and Download Flash pages, I mean).

In the meantime, don’t give up on the free wysiwygs and other editors. Free doesn’t always mean crap. : )

I’ve been using Dreamweaver 8 until now, and I am not sure if it looks good in IE8 since I’m on FF :smiley:

Adobe products may be buggy but I’ve not heard of them being insecure…

I’m not so sure, because a while back I had three sites hacked with the iframe virus. Each of the sites was on a different server/host, and the only common link was that the FTP details were kept within DW’s site management. It seems that there is a virus that specifically looks for these details, as they don’t seem to be hidden that well.

The virus managed to get in past all my security and I had to get McAfee to track and delete it via their removal service (quite expensive). I believe they have since updated their software to catch this one, as I have had no problems since.

Hm… that was the one bit I thought might be hax-able:

…except maybe somehow the ftp part.

Maybe a good reason right then and there to drop DW8, then? You got an upgrade to CS3, right?

Don’t be “on FF”, use ALL the browsers : ) They are free to download. Check your site in all of them.

I’m really leaning towards your hoster being the cause of getting the worm, unless your passwords got grabbed. The site access itself has its very own password used nowhere else, right?

The hosting company established a new data center, and they moved my data to a server at the new site. They insist that their site has not been hacked, and I insist that my local machine has never been infected with the Gumblar worm. In order to add PHP to my XHTML, a hacker would have had to know my password for either FTP or cPanel to upload the malicious script. The real question is: how did they learn my password? I have subsequently changed my password, secured an encrypted password vault, and am in the process of changing all of my important passwords to randomly generated character strings produced by the password manager (LastPass).

Was DW specifically mentioned? Cause they also make a (very expensive) back-end application platform and I can imagine that having security issues.

I went back and reread the Forbes and PC World articles, and they seem to be focused on Adobe Reader and Acrobat. I believe that it was the hosting company that mentioned the other Adobe products.

In the meantime, don’t give up on the free wysiwygs and other editors. Free doesn’t always mean crap.

I have been searching diligently for a legal and fairly priced copy of Adobe CS4 Design Premium, but Adobe is really gouging on the price. I am willing to pay a reasonable amount to upgrade my Studio 8, but the price demanded by Adobe for an upgrade is extreme, at more than double what I originally paid for my Macromedia Studio 8. Most of the products available on eBay are either used, pirated, or student versions that require certification that one is a student. I am not a student.

Interestingly, for whatever reason, I can now access all of the relevant Adobe Web pages with IE8. The fact that those pages would “freeze” my browser over a period of several days is very strange.

I’m not so sure, because a while back I had three sites hacked with the iframe virus. Each of the sites was on a different server/host, and the only common link was that the FTP details were kept within DW’s site management. It seems that there is a virus that specifically looks for these details, as they don’t seem to be hidden that well.

The virus managed to get in past all my security and I had to get McAfee to track and delete it via their removal service (quite expensive). I believe they have since updated their software to catch this one, as I have had no problems since.

During my research, I ran across a number of horror stories posted by network administrators telling how difficult, or even impossible, it is to get rid of this particular worm and its offspring, all of which apparently originated in China almost a year ago. It “morphs” on a regular basis and hides itself using obfuscated characters and base64-encoded code. It’s driving administrators right up the wall. :mad:
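For anyone curious, the injected code described in those reports typically looks something like the following hypothetical, defanged sketch tacked onto the end of an infected page (the real payload is a long base64 blob that decodes to obfuscated script and differs from site to site; I’m not reproducing actual worm code):

<?php
// Hypothetical, defanged illustration only -- NOT the actual Gumblar code.
// The genuine string is a long base64 blob, different on every infected
// site, which decodes to obfuscated script before being executed.
eval(base64_decode('...payload elided...'));
?>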

deesy

Firstly, I’ll state factually that Dreamweaver hasn’t been infested with a worm; I’ve yet to see any proof of such a thing occurring. Though it probably could be, the only products which get regularly exploited by such malicious software (due to their inherent connection to the Internet, and therefore their role as the point of penetration) are Acrobat Reader, Shockwave and Flash… all of which are directly embedded into the browser in some form, such as a BHO (browser helper object) or plug-in.

I have to disagree with you again about your claim that the software companies are being irresponsible. If you honestly believe that you can write a piece of software without bugs occurring (due to human error, component error or some clever method of sneaking something in) then you need to step back and take in some common sense. Humans are not perfect; they make mistakes, and they can’t ensure that people won’t be able to exploit something they haven’t figured out. Hackers are intelligent people who spend a great deal of time scanning over code, or over how applications work, trying to think of the things these companies have failed to think of, so as to target or hijack a program (no matter how small the glitch, even if that glitch wasn’t intended); they have the benefit of working in retrospect, with no foresight required. There is no way you can expect something with tens (possibly hundreds) of millions of lines of code, especially software which allows people to enter code of their own (whether ActionScript, JavaScript or something else), thereby offering a practically limitless amount of input, to somehow be controlled with every possible factor conceived of in advance. It’s a statement of insanity to honestly believe that such control and foresight is possible at any conceivable level.

Don’t turn the responsibility for security onto the makers of the products. Irresponsible would be your host having a worm on their servers and not admitting to it, or you having a worm and not taking precautions or ensuring there’s no issue at your end. Ignorance and denial are the issue… Adobe, like all software companies, does as much as it can to protect users by releasing bug fixes (etc.), though that takes time. But don’t turn them into the bad guys when the responsibility should be on those who cause the problems (the people who write malicious software) and on (for lack of a better term) the general ignorance of end users who happily open attachments, don’t frequently scan for viruses, and don’t show common sense in how they use their PCs (thereby perpetuating the cycle). End users put everyone at risk, not the software developers. I used to be a full-time software developer and it makes me really angry when people act like the bugs are entirely our fault. I happily fix bugs in stuff I write, but you can’t expect me, or people like me, to have some godlike ability to know every eventuality for everything. I wrote a (commercial) code snippet manager with probably fewer than 5,000 lines of code… yet after persistent testing, checking all my code regularly, etc., it still had 67 bugs since v1 which needed to be fixed or updated… some small, some very large. I’m NOT a bad coder; it’s just a fact of life that bugs occur. :slight_smile:

Alex, it would be helpful if you could enclose quotations of the specific parts of posts to which you are replying.

I understand that you might see this part of the world in black and white because you are a professional software developer. In fact, the software world is often mostly shades of gray (but not for good reasons).
Any good software development team should have a software QA procedure that will test the product far beyond just a single development programmer. A well-managed software development program will include most (if not all) of the following phases:

  1. Desk Check (programmer)

  2. Module Test (programmer with other development team members – this could be a single page from a multi-page Web site)

  3. Alpha Test (the entire development team, plus a number of users – the entire Web site would be tested on a variety of hardware and browser software under different conditions)

  4. Beta Test (a reasonable selection from the user base, with incentives to report problems and anomalies)

  5. General Availability Release (all users)

Note the degree of testing that should be performed before a software product is released to the public. Any shortcuts in these phases of development QA procedures are almost certain to result in problems being detected by end-users, usually in some sort of bad experience.

Those developers that follow these sorts of Software QA guidelines will still experience bugs that manage to escape into the General Release, but typically not nearly as many of them.

When Forbes Magazine and PC World tell us that Adobe software is now more vulnerable than Microsoft software, what are we expected to think about Adobe’s QA standards and procedures?

Sorry, Alex, I do not agree on this issue. I’m just glad that the people who make our airliners, bridges, dams, tall buildings, pharmaceuticals, surgical instruments, food products (except those from China), hospitals, power plants, etc., don’t feel the same way about quality. :slight_smile:

deesy

It’s a well-established fact that Adobe and Microsoft (etc.) all implement the testing methods you have stated (as do I), but the point is that no two computer systems are identical and it’s impossible to account for every possible end-user scenario, no matter how many testers you have… especially on a programming project as large and dynamic as a website editing product (considering the factors it needs to account for). With such a level of dynamism in respect to the situations it has to perform under, any comparison with things like foodstuffs, buildings and electronics (which are generally produced off a factory line from a base template and require no “case scenario for every end user”), or with things produced for a certain place at a certain time under strict conditions (like a power plant or a dam, where the situation can be controlled and measured exactly because there’s no chance it’s going to have to operate differently for every person using it), is quite frankly absurd.

The case in point is that there has NEVER, to my recollection, been a single case in history where a software product (more complicated than 1,000 lines of code) has been able to operate entirely bug-free on every single system, for every single user, under every single possible test condition. Any expectation you have of a software developer’s psychic abilities, plus the expectation that mass testing will solve every bug or glitch, is crazy. Microsoft gave away FREE beta test copies of Windows Vista for everyone to play with, report bugs and help the product become stable, and what happened? Bugs occurred. Even small pieces of software that have thousands, maybe millions, of beta testers (like open-source products) still end up with bugs post-launch (after all the found bugs have been ironed out). I can’t see how it’s a matter of “quality” at all; bugs are an expected eventuality for any software development project. You can’t therefore say that every piece of software in existence is a piece of crap because you happen to have encountered a bug in it (though in your case it wasn’t a bug, just a life-cycle issue).

The only way you could guarantee that a product will run bug-free in the final build would be to leave the product in a perpetual and permanent BETA cycle. So you can go ahead and talk about all the software development stages you like in an attempt to make it look like the issue could reasonably be removed (from the biased perspective of a non-programmer, with no actual evidence that a bug-free guarantee on a product is even possible), but what you are doing is perpetuating a pipe dream. No programmer will ever have a stable enough environment to control every possible variable that may cause problems, and to assume so is just plain wrong… if you think such a thing is possible and don’t understand the problem that the likes of Adobe and Microsoft have in developing such complex solutions, I have a simple answer for you: write one yourself (one that does the job better and with fewer bugs), then come back and we’ll happily accept that you’re right. :slight_smile:

It’s a well-established fact that Adobe and Microsoft (etc.) all implement the testing methods you have stated (as do I), but the point is that no two computer systems are identical and it’s impossible to account for every possible end-user scenario, no matter how many testers you have… especially on a programming project as large and dynamic as a website editing product

Hmm. Actually, how do you know that it is a “well-established fact”? Aren’t you really just making an assumption? How do you explain the fact that Microsoft software has, traditionally, been considered less stable than UNIX, Linux, Novell or Mac software? What about military software? You know … the stuff that controls high-speed missiles, nuclear weapons, spacecraft, etc.? Do you suppose that military software is really buggy, and the way we find out is when we accidentally wipe out a city? And, even though “no two computer systems are identical” (actually, a lot of them are), aren’t there only a few standard architectures on which most modern computers are built? Aren’t they the Intel x86 series (IBM PC) and the Apple hardware? Aren’t almost all Linux boxes also able to run Windows and DOS?

… any comparison with things like foodstuffs, buildings and electronics (which are generally produced off a factory line from a base template and require no “case scenario for every end user”), or with things produced for a certain place at a certain time under strict conditions (like a power plant or a dam, where the situation can be controlled and measured exactly because there’s no chance it’s going to have to operate differently for every person using it), is quite frankly absurd.

Well, we’re just going to have to agree to disagree on this point. Quality is quality. Good Quality Assurance (QA) programs have strikingly similar standards and procedures, regardless of the nature of the product. Have you, for example, ever looked into the ISO 9000 program?

The case in point is that there has NEVER, to my recollection, been a single case in history where a software product (more complicated than 1,000 lines of code) has been able to operate entirely bug-free on every single system, for every single user, under every single possible test condition.

This assertion is simply impossible to prove or disprove. I have seen large, complex software systems that have been able to run for YEARS without interruption. The world would be a sorry place if humans were not able to produce computer programs that were sufficiently stable to be trustworthy. Novell Netware, UNIX, Mainframe Operating Systems, and military systems are some examples of such software.

Any expectation you have of a software developer’s psychic abilities, plus the expectation that mass testing will solve every bug or glitch, is crazy …

Do you really believe that a software developer must be psychic in order to take reasonable precautions when developing software, or that he/she should not make a good-faith effort to remove all foreseeable defects from his/her code? If that attitude is pervasive at places like Adobe, it would explain a lot! BTW, “mass testing” is not the same as “well organized” testing.

Microsoft gave away FREE beta test copies of Windows Vista for everyone to play with, report bugs and help the product become stable, and what happened? Bugs occurred.

Right! Microsoft did the same thing with Windows 95. In fact, they bragged about how they had more than 3 million “Beta Testers” worldwide. We all saw how that worked out! Sending out advance copies of your new software so that users might be able to “play with” it is not a real Beta testing program. Firstly, you have not, necessarily, placed the code into the hands of those users most qualified to really test it. Secondly, you have not limited the distribution to those who will assiduously observe and report anomalies back to the developer. Lastly, you have not provided a sufficient incentive to the testers to ensure that they will report their observations clearly and promptly. I have been a Microsoft Beta tester. I am not impressed with Microsoft’s Beta testing program. It might be good Marketing, but that doesn’t make it good QA!

Even small pieces of software that have thousands, maybe millions, of beta testers (like open-source products) still end up with bugs post-launch (after all the found bugs have been ironed out).

Well, of course. Most developers today are not willing to spend the money required to ensure a clean, stable release of their software (although some, obviously, do). Open-source developers are usually not being paid at all, so the quality of software testing is probably less organized than it might be if it took place within a single organization.

The only way you could guarantee that a product will run bug-free in the final build would be to leave the product in a perpetual and permanent BETA cycle.

That is an extreme statement that borders on the ridiculous. If it was true, then how is it that some developers are able to release and support highly stable software products?

So you can go ahead and talk about all the software development stages you like in an attempt to make it look like the issue could reasonably be removed (from the biased perspective of a non-programmer [emphasis added], with no actual evidence that a bug-free guarantee on a product is even possible), but what you are doing is perpetuating a pipe dream.

Now you’re resorting to insults. I’ve been programming for more than forty years, and in a variety of languages. I have held positions of responsibility in Software QA, Software Development, Programming, Project Management, Network Management, IT Management and Hardware Development. You don’t have any knowledge at all about my background, yet you refer to me as a non-programmer. Unless, that is, you were referring to the other sources I mentioned in my post. But then, you don’t know anything at all about their backgrounds, either …

No programmer will ever have a stable enough environment to control every possible variable that may cause problems, and to assume so is just plain wrong… if you think such a thing is possible and don’t understand the problem that the likes of Adobe and Microsoft have in developing such complex solutions, I have a simple answer for you: write one yourself (one that does the job better and with fewer bugs), then come back and we’ll happily accept that you’re right. :slight_smile:

That is an inane statement. If I don’t like my car because of manufacturing or design defects, then I should build my own car and see if I can do it better? Is that your position? Thank God the whole world doesn’t think like you. We would be drowning in defective goods and services, and nothing would ever improve.

deesy

deesy58, let me get this topic back on track. In respect to knowing whether it’s an established fact or not: Microsoft have been pretty open about the development cycles they undertake in taking a product to market. It’s not an assumption that they go through conventional testing stages like Alpha, Beta and Release Candidate when details of those stages have been made publicly visible for previous products. As someone who is a programmer, I find it rather hard to believe that you compare the likes of Windows to military software, or to the kinds of embedded systems which run entirely in a self-contained environment.

It’s time to end this apparently-pointless discussion, so if you choose to respond, you will have the last word.

Don’t you think it is a little naïve to believe that military systems are all “embedded systems which run entirely in a self-contained environment”? Do you imagine that all military computers have the same number of CPUs, the same amount of RAM, the same number and type of peripheral equipment and I/O devices, the exact same number and combinations of application programs? As somebody who worked for a software development contractor that was writing code for the US military, I am not speculating. So, whether you “find it rather hard to believe” or not is of no consequence, because you are uninformed on this matter.

Such systems have a much lower “risk of issue” in that the hardware, and the entire environment they run in, are built purely for that system alone, and therefore they don’t have to cover the compatibility spectrum which a piece of software like the Windows OS must, balancing its efforts across an endless combination of hardware.

I think I just addressed this issue. Military computer systems are a lot more diverse than you imagine. Also, the military uses hardware that you have never seen, never used, and of which you are probably not even aware. Somehow, they seem to be able to make it all work, and work reliably. I am not referring to just the US military, either. Russia, the UK, Australia, China, the EU and many other countries develop and maintain computer software that is robust and reliable.

Any such comparison is a straw-man argument, plain and simple. I would argue that Linux is AS buggy as Windows (as is Mac); the only difference is that because the Windows audience is less technically savvy and (generally) cruises along without any knowledge of security, their machines are more likely to be compromised… it’s a larger target and more likely to be “shot at”. If you don’t believe me on the matter of embedded systems vs. PC stability, or the inherent risk that Windows has to deal with, take it up with Steve Gibson; he reinforced these points a while back on Security Now.

According to information to be found at: http://en.wikipedia.org/wiki/Comparison_of_web_browsers
there has been a significant difference between various browsers in terms of security vulnerabilities resulting from weak QA and testing procedures. Internet Explorer 6 topped the list with a total of 463 unpatched vulnerabilities identified since November 20, 2000. Perhaps surprisingly, Internet Explorer 8 is next with 43 identified vulnerabilities since January 14, 2009. Internet Explorer 7 was third with 27 vulnerabilities identified since August 15, 2006. In contrast, Safari had three identified since August 15, 2001, and Opera had none. SeaMonkey and Google Chrome also had none, while Mozilla Firefox has one, identified on March 24 (four days ago), but we don’t know which versions of Firefox are being referred to, just that they are “current stable releases.” In any event, it is clear that not all browsers, like not all software programs, are created equal. Some are clearly more stable, and of higher quality, than others. It is also likely that some software developers are more responsive, and more responsible, when it comes to software quality. This comparison of unpatched publicly known vulnerabilities in latest stable version browsers is based on vulnerability reports by SecurityFocus and Secunia.

In regards to my comments about you being a non-programmer, that was STRICTLY in the context of this conversation, on the matter of how browser rendering engines operate, how Adobe’s Dreamweaver product is forced to rely on code over which it has no control and no reasonable ability to future-proof (the components being third-party), and how you’re placing an unreasonable expectation on such a product (in the sense of a WYSIWYG design-view window) in that respect.

You didn’t say that. Also, it seems clear that the Dreamweaver product does, indeed, use a rendering engine that Adobe might not totally control, but which it certainly chooses to incorporate into its product (see posts from others on this forum). If the CSS that Dreamweaver generates is sufficiently erroneous that more than one member of this forum can see it with the naked eye, then where is the software defect? I believe that it is in Dreamweaver, but a reasonable argument might persuade me that it is somewhere else.

Your comparisons between military systems, Linux and other products in that sense are ludicrous, because none of those products have to perform their function based entirely on third-party components which evolve on a regular basis (nightly, if you go by how often WebKit and Gecko are updated in their build cycles), where the only measure of potential future rendering is a set of specifications over which Adobe has no control. It’s like holding Linux responsible when a product someone developed in C++ crashes, just because Linux is what runs the executable code.

I don’t take your point because it isn’t clear what you are saying. Are you trying to say that a program written in C++ can’t be compiled and run on the Linux OS?

Since end-users don’t directly purchase and use Webkit and Gecko, would developers be wasting time and energy trying to keep up with changes in a rendering engine that might never become incorporated into a browser product before it is released to the public? Would this be similar to incorporating changes made to program libraries into one’s code before those changes were adopted as standards? Wouldn’t that be a bit risky?

Other people in this thread have pointed out on several occasions (which you didn’t reply to) that if you are unhappy using Dreamweaver, due to it, like all WYSIWYG editors, ageing badly given its inability to control how the browser model evolves, then you should try a different product… there are plenty of free solutions which do just as good a job. As for your coding experience: unless you are going to state here that you have produced a web browser rendering engine, that you work for one of the browser makers, that you’re a member of the W3C (who write the specs), or that you are a web professional (in the sense of understanding the code you’re writing, which, as you use the design view, may well not be the case), you really aren’t in a position to question the motives of the business and how it produces its software when you clearly don’t have all the facts about the situation. You can be as aggressive as you like towards me, but the simple fact is, my sympathy is with Adobe. I understand why they shouldn’t be held accountable for other people’s code (which is what the design window’s output effectively becomes when used in a web browser)… and I have little sympathy for anyone who uses comparisons in situations where they have no relevance. After all… Dreamweaver as a product IS bug-free: you can use the code window all you like and there’s probably little to no impact in terms of rendering when you write the code… it’s just the part of the system dependent on outsourced components (which it cannot produce internally, due to public usage of web browsers) that is affected.

Get a grip! This isn’t religion! This is computer software. Your emotional assertions make one believe that maybe you work for Adobe. Do you? “After all… Dreamweaver as a product IS bug-free …” C’mon! You can’t REALLY believe that! Dreamweaver is bug-free!?? I won’t even dignify that assertion with a response.

deesy

I’ve got the book. I can see what you are trying to do. How did you go with the top links?

<snip/>

Wow, talk about a spectacular reversal of opinion. Throughout this thread I have stated to you time and again that, when it comes to the rendering of websites, Dreamweaver does not use its own rendering engine. Yet here you are directly contradicting your own denial (which caused the difference of opinion between us in the first place) by stating that, after reading other people’s posts in the forum, you suddenly noticed that what I said was factual. You also suddenly started making references to the other rendering engines and to whether their future behaviour will be supported by such products (you earlier stated that knowing the future was not relevant, even though future-proofing your product to cope with future rendering engines supplied as outsourced components clearly requires exactly that; otherwise you’re setting future standards, not following them). If Adobe doesn’t account for potential future events, like a new standard not yet standardised, how can you or anyone else expect their product to support that standard when invoked (which, of course, will be required to display the website properly)?

As this will be my last post on this thread, think about this. By asking Adobe to ensure that their product will correctly render your website when standards change, when renderers stop supporting current methods of displaying information, etc., you ask them to ensure their product supports the very things which have not yet been written, and changes which are made outside its life-cycle. Yet you still want to hold them responsible for not ensuring their (now elderly) product supports what were, at the time, non-existent standards and rendering engines with very little support (in 2004, WebKit and Gecko didn’t have such a huge following), and you still believe they should be held responsible for code which they neither produced nor had control over (but were forced to support, because the browsers are what end users use). Sorry if you think I’m cynical (or some Adobe spy, as you presumed)… you can blame Adobe if it gives you “warm fuzzies”, but I’ll say what I’ve been saying throughout this thread: your expectations contradict what you’ve said previously, so I deem them totally unrealistic of Dreamweaver. :slight_smile:

I’ve got the book. I can see what you are trying to do. How did you go with the top links?

I moved the links down and to the left, at the margin.

My CSS code looks very much like that found on page 152. To the #controls ul, I added rules for font type and inline display.

To the #controls li, I added rules for text-align: right and margin: 0em. Otherwise, it isn’t very different from the book.

It displays beautifully, and correctly, in the design window. When rendered by Firefox 3.6 or Internet Explorer 8, it looks like crap!
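In case it helps, the relevant rules look roughly like this (an approximation typed from memory; the font stack is just a stand-in for whatever the book specifies):

#controls ul {
    font-family: Verdana, sans-serif; /* stand-in for the font rule I added */
    display: inline;                  /* the inline display rule I added    */
}

#controls li {
    text-align: right; /* added */
    margin: 0em;       /* added */
}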

deesy

I tried to attach an MS Word document that had screen shots of the offending images, but it apparently is not possible to keep them attached. For those of you who want to see how my CSS makes the images render, I guess a Site Moderator will have to release the attachments. I don’t know how else to show them to you.

deesy

Normally, attachments pending approval are still viewable in the post itself, and I see nothing in the post (usually it’s a little thumbnail saying “attachment pending”).

Though a Word document is nasty: code can be posted in the thread (even very long code, if it’s inside [code] tags), and images like screenshots are best uploaded in an image format.

http://stommepoes.nl/haslayoutmargins.png

or with [img] tags: 
[img]http://stommepoes.nl/haslayoutmargins.png[/img]

I mean, I would hope Open Office can open a Word document, but usually Windows uses nasty proprietary charsets that don't play nice on my computer : )

This might also want to be opened as a fresh new thread (“Issue in IE8”): post the code and screenshots of what's incorrect, and maybe a screenshot from a browser that shows it correctly, so we know what is wanted and what you're getting. We can also then copy the code (the entire code, not just bits) and test in many other browsers as well (since sometimes that helps if the problem is weird).

There are no attachments waiting anywhere on SitePoint, as I have approved them all this morning. There weren’t any for this thread, and as Stomme said above, you would see a message saying “awaiting approval”.

Maybe the post timed out and you lost the attachment.

Might just be a glitch - I often lose posts when they time out. Have to keep an eye out and see if it happens again.