Sep 13, 2003, 09:29 #26
Hi...
Originally Posted by firepages
That was included in "good design". In your example it would mean a tractor designer taking a look at sports car design in order to build a better tractor. There is nothing wrong with studying designs in other systems; it is actually a very good idea. Part of good design is that it gives you the possibility of trading issues against each other. I hardly think anyone on this forum is going to implement J2EE in a single PHP script. Part of the skill in design is choosing the right tool, and that skill is no less apparent in Java designs even though the resulting setup is different.
Unfortunately, buried in the performance comparisons in this thread is a real argument. What OO gives you is enough flexibility to not repeat yourself. This means that change is easier and overall development time is reduced. The development time is nearly always worse while you are cutting the first few classes, and performance for small sites will be worse as well. As the site grows, or more sites are created, you get big wins.
But you lot all know this; the point is that you are balancing and trading some performance against the much more expensive developer time.
Worst case: your team uses a Java-like PHP library with lots of stuff you don't need (say you just happened to be familiar with the one you used with Java) and an XML (or Smarty) template system for a six-month project. Say it runs five times slower, but you saved two months doing it this way and you expect to save another maintenance month while the system settles down to the final version. The cost of those three months is about $20,000! For that money you can load balance three servers and throw in the licenses for the Zend performance suite.
This is an extreme case. I have massively exaggerated the factors here in favour of under-design against over-design, and this doesn't include later maintenance work, say changing to an external authorisation server, which benefits from the OO investment as well. I just cannot get the "gut the design for performance" argument to add up.
For anyone looking at web software design I would say, please, please have a look at the Java libraries and try out some of the ideas. It really is a fantastic resource.
yours, Marcus
Marcus Baker
Testing: SimpleTest, Cgreen, Fakemail
Other: Phemto dependency injector
Books: PHP in Action, 97 things
Sep 13, 2003, 09:44 #27
Originally Posted by lastcraft
For anyone looking at web software design I would say, please, please have a look at the Java libraries and try out some of the ideas. It really is a fantastic resource.
I'd say it's fair enough for a project to run over slightly if it means the actual development is going to be better for it.
Like, what's another 2-4 weeks? Nothing, if at the end of the day you get a better result in terms of design and how easily it can be expanded later.
Sep 13, 2003, 11:17 #28
I use PHP for professional development, and I agree with you on the benefits of a well-designed framework/library when it comes to cutting down development time.
Two weeks ago I joined my current project, which is supposed to go live next Thursday. The project architect (who'd never heard of Martin Fowler or the Gang of Four before I mentioned them) had decided to make heavy use of PEAR classes and to use Smarty, because one of the client's requirements was no PHP in the templates. The client is currently load testing the system on four load-balanced, Zend-boosted dual Xeon servers. The current setup can't handle more than a few hundred concurrent users without severe lag. Next Thursday the system is supposed to go live on 16 of those servers and will have to handle up to 20,000 concurrent users.
Maybe bringing in a few extra developers early on wouldn't have been such a bad idea after all.
On the other side, I'm on an open source project building (yet another) CMS. We want people to be able to run it on cheap hosted webspace, i.e. sharing a server with loads of other people, with lousy performance, a MySQL server that's not on localhost, no script acceleration whatsoever, etc.
We started out using Phrame, Eclipse and Smarty, but our first draft was almost as slow as Tiki when we ran it on SourceForge webspace. Now we're building our own framework, using AwsomeTemplateEngine and "defactoring" key parts of our system on purpose, because a "well factored" system of classes just doesn't pull its weight in what we think is a typical PHP application.
Sep 15, 2003, 09:42 #29
Originally Posted by lastcraft
Wireframes (in the context I use them) are a quick way of determining the flow of a website without actually generating code.
I have v3.99 of this:
http://sourceforge.net/projects/wireframetool/
Wireframing is not exclusive to Fusebox. It is applicable to anyone who wants to determine flow before they code. In Fusebox, this helps to make decisions on how Fuseactions and Circuits can be grouped.
Cheers,
Keith.
Sep 15, 2003, 11:08 #30
Hi...
Cheers for the link, but I don't have a copy of Cold Fusion. Could you post a screenshot?
yours, Marcus
Marcus Baker
Testing: SimpleTest, Cgreen, Fakemail
Other: Phemto dependency injector
Books: PHP in Action, 97 things
Sep 15, 2003, 11:18 #31
Originally Posted by lastcraft
... but I don't have a copy of Cold Fusion ...
IMO it's basically an overrated markup language passing itself off as a server-side technology.
Plus, it's expensive.
Sep 15, 2003, 14:16 #32
Originally Posted by lastcraft
Cheers,
Keith.
Sep 15, 2003, 14:22 #33
This is getting way OT, but I've found wireframes really, really useful. They allow us and our clients to prototype really rapidly - once a wireframe is done and signed off, we both know exactly what they're going to get. Splitting the functionality from the design is very effective in getting the client to think about how their app should work.
(the we is my company, not the royal (or Zeldman) we)
We started off producing wireframes similar to those on the IAWiki in Dia, but soon switched to using plain unstyled (but well structured) XHTML. This allows the client to see the app 'working' before we've built anything and avoids us building the wrong thing (even if it was actually The Right Thing...)
I've just had a quick look at the Fusebox stuff and their FLiP methodology seems very similar to what we've arrived at:
http://www.fusebox.org/index.cfm?&fu...hodology.steps
This article gives some more coherent reasoning (than mine, not than Fusebox's) for using HTML:
http://www.boxesandarrows.com/archiv...nd_no_pain.php
It suggests using Dreamweaver or similar, and I guess that could make the whole thing even faster. I'm slightly wary of WYSIWYG tools myself (emacs + psgml-mode = v. fast), and there's maybe an argument that using Dreamweaver could lead to concentrating too much on the look of the app. IMO the screenshots in the above article illustrate this perfectly. I prefer the client to have something that everyone can agree is ugly; that way they can forget about the looks and think about the functionality. The wireframes should be about how it works, not how it looks. However you mark up, though, I've found it a very worthwhile practice for web apps.
We contract out graphic design - the signed off wireframes can be passed on to a designer to work on while we're busy developing the app. The wireframes, along with some delivery guidelines, make life easier for us and for the designer.
We use CSS for layout these days, so our plain (but well structured) XHTML wireframes can be transformed at the drop of a hat when we receive the final design - another bonus in that we don't have to redo the markup. We provide a hook to a stylesheet in the wireframes so that our designers can work directly on the CSS if they want, but so far we haven't had much uptake on this - hopefully this will change soonish, making our cosseted lives even easier.
Anyway, that's enough of my rambling - wireframes are good like cheese is good!
Cheers,
Jon
Sep 16, 2003, 15:48 #34
Originally Posted by Selkirk
Not to mention that I have the same last name as Brian Cox.
As far as CRC cards go, aren't they normally associated with Responsibility Driven Design? In my opinion, the CRC card isn't a required part of RDD, but it helps, and it is often the tool used to teach the RDD idea to students: that the various portions (objects or modules) of a program can be identified up front based on responsibility. I think this is a key step in helping reduce bloat! Many times you come across objects (or should I say classes?) that contain methods that are seldom used, if ever. As a result, you get the overhead of parsing larger classes without the justification of using them in their totality. This is the reason I feel the Eclipse libraries make so much more sense for what most people are doing on their web sites: Eclipse is rather lightweight compared to something like ADOdb.
Now I'm not putting ADOdb down. Not at all. I think John Lim is one of the strongest PHP guys out there! It's just that ADOdb is more than most people creating sites will ever need.
On the other hand, if you do need the functionality it provides, it's a much better performing option than PEAR, Metabase, and PHPLib.
Cheers,
BDKR
If you're not on the gas, you're off the gas!
Sep 16, 2003, 16:00 #35
Originally Posted by ZangBunny
One thing real fast: are you really 'defactoring' parts of the system, or just stepping away from a purer OO implementation? What exactly do you mean by 'defactoring'?
Also, have you read some of the stuff that John Lim has posted concerning the use of OO in PHP? Check out the link below.
http://php.weblogs.com/discuss/msgReader$537?mode=day
The one thing that is strange to me is that the number of methods in a class doesn't seem to have an effect on performance, only on the length of time needed to parse. Hmmmm?
Cheers,
BDKR
If you're not on the gas, you're off the gas!
Sep 17, 2003, 00:36 #36
Originally Posted by BDKR
Sometimes good design comes at the expense of good performance, but then everything is a trade-off. It's up to you to choose where you do the trading for your circumstances.
I think a good approach is to strive (within reason) for architectural perfection, and then denormalise the code afterwards. This is common practice with database design: tables are sometimes denormalised for efficiency, but only after the ideal design has first been worked out.
This isn't always the best approach though, particularly when the foundation of the design sits on a performance bottleneck. So it works both ways. You just have to keep the performance issues in the back of your mind while striving for that perfect architecture. This is the difference between an experienced developer and an amateur.
The other question you have to ask yourself is: can your client afford the increased cost of developer time for badly designed code? And what about the hidden costs of dealing with the consequences of bugs that the bad architecture is responsible for - if indeed there was a trade-off between good design and performance?
The design phase is the smallest part of the total application lifetime (if the thing is of any use at all). Maintenance is the larger slice of the total cost of ownership (TCO).
Sep 17, 2003, 00:53 #37
Originally Posted by ZangBunny
Sep 17, 2003, 10:43 #38
Hi.
Originally Posted by tezza
"Defactoring" is an excellent term and I think we are groping towards something very interesting here (in the way of the C2 Wiki). I am going to hijack it for a second and define it as "selectively reintroducing some repetition for performance". This brings it in line with denormalisation.
The cost of this is clear: any later work will involve making multiple copies of the same change in the code. That is really bad news, so it has to be weighed very carefully against other solutions. It is also the reason such defactorings are left to the last minute, as the author certainly doesn't want to cripple the development process at an early stage. Unfortunately, the project had better be stable, or later maintainers will not be so lucky.
This is not the same as removing unnecessary design. Stripping away the "grand scheme" in favour of something simpler is refactoring, right up to the point where it starts to hurt in later development time. The point of refactoring is to cut later costs by investing time in the software design as you go. I cannot think of a sane reason to defactor except for unskilled developers down the line, or performance (any more suggestions?).
The performance type of defactoring can be accomplished with code generation. If you are writing twenty classes that are very similar, put the differences into your own format (say an XML one) and use a tool to generate the actual class code from templates. We do this with web pages, after all. You can either do this within the package or ship the package prebuilt.
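Something like this, for illustration - the spec format, class names and output path are all invented, and a real tool would read XML and do rather more:
PHP Code:
// generate_classes.php -- run offline by the developer, never per request.
$spec = array(
    'Customer' => 'customers',
    'Invoice'  => 'invoices',
);

$template =
    "class %CLASS% {\n" .
    "    var \$_table = '%TABLE%';\n\n" .
    "    function findById(\$id) {\n" .
    "        \$result = mysql_query(\n" .
    "            'SELECT * FROM ' . \$this->_table . ' WHERE id = ' . (int)\$id);\n" .
    "        return mysql_fetch_assoc(\$result);\n" .
    "    }\n" .
    "}\n";

foreach ($spec as $class => $table) {
    $code = str_replace(
        array('%CLASS%', '%TABLE%'),
        array($class, $table),
        $template);
    $fp = fopen('generated/' . $class . '.php', 'w');
    fwrite($fp, "<?php\n" . $code . "?>\n");
    fclose($fp);
}
The repetition still exists, but only in the generated files; the template is the single place you ever edit.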
Unskilled developers, or extenders who will not have time to learn the system, need to be dealt with slightly more carefully. Either use the generation gap pattern, where the generated classes can be extended and then reintroduced into the system, or give them a base class and let them add extensions as "plug-ins". Both are tricky to get right and have downsides. You will have to write tutorials, at least.
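A sketch of the generation gap idea, with invented class names - the tool owns the base class, people own the subclass:
PHP Code:
// generated/CustomerBase.php -- rewritten by the generator on every run.
class CustomerBase {
    var $_table = 'customers';

    function findById($id) {
        $result = mysql_query(
            'SELECT * FROM ' . $this->_table . ' WHERE id = ' . (int)$id);
        return mysql_fetch_assoc($result);
    }
}

// lib/Customer.php -- hand-written extensions live here and survive regeneration.
class Customer extends CustomerBase {
    function findByEmail($email) {
        $result = mysql_query(
            "SELECT * FROM " . $this->_table .
            " WHERE email = '" . mysql_escape_string($email) . "'");
        return mysql_fetch_assoc($result);
    }
}
The rest of the system only ever talks to Customer, never to CustomerBase.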
Please look away now...
I haven't yet said anything about the "pure OO" argument.
PHP is not pure OO because not everything is an object. This forces you to make design choices about the smaller objects in the system. I don't consider that an advantage, but the lack of strong typing mitigates it a lot. For example, for collections (stacks, queues, lists) you can just use a hash. The cases where you want more (trees, lazy loading, caching) are less common, and you usually know about them ahead of time. If there is any doubt at all, though, I use an object.
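To illustrate - nothing invented here, just plain PHP arrays standing in for collection classes:
PHP Code:
$stack = array();
array_push($stack, 'first');
array_push($stack, 'second');
$top = array_pop($stack);      // 'second' -- LIFO, no Stack class needed

$queue = array('a', 'b');
array_push($queue, 'c');       // enqueue at the tail
$head = array_shift($queue);   // 'a' -- FIFO, no Queue class needed

$seen = array();
$seen['marcus'] = true;        // and a hash doubles as a set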
The top-level activity of PHP is a script selected by Apache. It even starts in HTML mode, so at least the top level of the app will not be OO unless you only see a framework. Either way, that bit is not pure OO. What else?
The OO that PHP has is a superset of procedural code, so it is always possible to migrate procedural code into OO code. Should you go the other way?
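The migration direction is as mechanical as it sounds - a sketch with invented names; wrap and delegate first, inline later:
PHP Code:
// The procedural original...
function fetch_user($id) {
    $result = mysql_query('SELECT * FROM users WHERE id = ' . (int)$id);
    return mysql_fetch_assoc($result);
}

// ...absorbed into a class one step at a time.
class UserFinder {
    function fetch($id) {
        return fetch_user($id);   // delegate for now, inline when convenient
    }
}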
Easy refactoring requires encapsulation. Avoiding duplication requires polymorphism. Going back to procedural code robs you of the ability to come up with any design, never mind a well-factored, high-performance one. If someone suggests "unfactoring" in this way, they must obviously delight in constant challenge. Perhaps they should saw a leg off and poke one of their eyes out as well.
Is performance worth it?
From the client side, no. Suppose you double the speed of your script with minor code changes alone. I am not talking real stuff here, such as caching data or keeping file handling down, as that would involve design changes; I am just talking about code smithing. You would have to be a pretty determined hotshot to achieve this across an entire project, so well done. Now I request a page from a well-connected country, say Japan. I have 30 to 300 msec for the DNS look-up, 2 secs to start getting the page and a 2 sec wait for the first byte to come back. I find that the page needs a stylesheet, so another 4 secs pass before I see anything meaningful. How much did you save? Nothing. All that happened was that the core script was sent out maybe 20 msec faster whilst the styles and images were loading (you streamed the page whilst processing it because you were worried about performance, right?).
OK, what about server load? When was the last time we actually measured it across all components in the system? Must be important, right? What percentage of the load is Apache? What part of the Apache load was PHP execution time? How much was reading the PHP file itself? Did we manage to save 10% of the load? Or did we skip that measurement and just buy a bigger server?
OO languages are a design enabler in the same way as declarative languages, functional languages and domain languages are. Dropping back to procedural coding to chase performance phantoms is pretty lame. We've moved on.
yours, Marcus
Marcus Baker
Testing: SimpleTest, Cgreen, Fakemail
Other: Phemto dependency injector
Books: PHP in Action, 97 things
Sep 17, 2003, 11:59 #39
When I say "defactoring" I don't mean anything as drastic as dropping back to procedural programming. I mean the smaller defactorings, such as "Replace Query With Temp" (especially if the query is against a database or a SOAP server), or "Replace Data Object With Array" if it is only dumb data, or simply looping over the result of a mysql_query() with mysql_fetch_array() instead of constructing a collection of data objects that I can walk through with an iterator.
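For instance, "Replace Query With Temp" in miniature - $order, getTotal() and applyDiscount() are invented for the example:
PHP Code:
// Before: getTotal() hits the database on every call.
if ($order->getTotal() > 1000 && $order->getTotal() < 5000) {
    applyDiscount($order, $order->getTotal());
}

// After: one round trip, and the temp is reused three times.
$total = $order->getTotal();
if ($total > 1000 && $total < 5000) {
    applyDiscount($order, $total);
}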
In short, I use common PHP idioms where I find them more appropriate than clumsily trying to duplicate Java idioms in PHP. Considering some other PHP/OOP idiosyncrasies, I find it more convenient to think in
PHP Code:
$result = mysql_query($sql);
while ($row = mysql_fetch_array($result)) {
    ...
}
than in
PHP Code:
$my_collection =& new ObjectCollection($sql);
$my_iterator =& new ObjectCollectionIterator($my_collection);
while ($my_iterator->isValid()) {
    $my_object =& $my_iterator->getCurrent();
    $my_iterator->next();
    ...
}
In his documentation for the Eclipse library, Vincent suggests the following idiom:
PHP Code:
for ($it->reset(); $it->isValid(); $it->next())
{
    $object =& $it->getCurrent();
    doSomethingWith($object);
}
(As usual with my posts, this one lacks a concise conclusion or even a climax at the end. I do this just so you know it's really me posting, not just some impostor.)
Sep 17, 2003, 12:00 #40
Originally Posted by lastcraft
Dropping back to procedural coding to chase performance phantoms is pretty lame. We've moved on.
Object oriented programming rocks, folks...
Sep 17, 2003, 13:06 #41
Does anyone know how much faster object-oriented scripts will run on PHP 5? Are any benchmarks available?
Sep 17, 2003, 19:19 #42
Originally Posted by lastcraft
life is good.
Originally Posted by lastcraft
Sep 17, 2003, 19:22 #43
Well, since I know a jab when I see one, here goes...
Originally Posted by lastcraft
Originally Posted by lastcraft
Originally Posted by lastcraft
That said, consider the case of ADOdb. A number of months back, John Lim did a comparison of ADOdb against Metabase, PHPLib, and PEAR (forgive me if I failed to mention any of the contenders). ADOdb whipped all of them soundly. In the process of this, or should I say as a result of this, Manuel Lemos (the guy responsible for Metabase) screamed, kicked, and cried foul. However, John explained it rather well.
[blockquote]
When I first benchmarked ADOdb versus native MySQL a year ago, I was horrified to find that my code was running 60% slower. So I worked hard to tune it until it got better, closer to 30% overhead, which is what it is today. Benchmarks are just another tuning technique if you have the right attitude.
I would expect the PEAR guys to be looking at their code now, trying to see why their code has a 150% overhead. They haven't attacked me personally. Tomas V.V.Cox seems to be nicer than either of us. I really don't know what else to say because even if egos are at stake there's no reason to be so rude.
Bye, John
[/blockquote]
Perhaps you should read it. 150% overhead for PEAR! I think that would matter to a client!
http://php.weblogs.com/discuss/msgReader$1112
Now, he doesn't explain what he did to "tune" it, but the link that I posted earlier gives a good idea of some of the things he may have done.
http://php.weblogs.com/discuss/msgReader$537?mode=day
But the most important thing I'm getting at here is that this wasn't about going procedural. The truth of the matter is that we hadn't yet determined what was meant by 'defactoring'. John managed to gain some good results from his objects, yet they remained objects!
But back to the "pure OO" squawking, one of the things he said in the second link I posted above was...
[blockquote]
3. using set/get methods seems like a bad idea for frequently used properties.
[/blockquote]
Hardcore OO zealots will crawl up your chuff over this kind of thinking. Now please note, this is exactly the kind of dedication to paradigm I was speaking of. The fact that people get so upset is proof that some fundamentalist toes have been stomped on.
Now, before you go into why not to do this - don't worry, I already know the arguments and don't need to be flamed to hear them again.
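If you want to see whether set/get overhead matters in your own code, a crude measurement is easy enough - a sketch, with numbers that will obviously vary by setup:
PHP Code:
function microtime_float() {
    list($usec, $sec) = explode(' ', microtime());
    return (float)$usec + (float)$sec;
}

class Point {
    var $x = 42;
    function getX() { return $this->x; }
}

$p = new Point();
$n = 100000;

$start = microtime_float();
for ($i = 0; $i < $n; $i++) { $v = $p->getX(); }
printf("accessor: %.4f sec\n", microtime_float() - $start);

$start = microtime_float();
for ($i = 0; $i < $n; $i++) { $v = $p->x; }
printf("direct:   %.4f sec\n", microtime_float() - $start);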
Originally Posted by lastcraft
Originally Posted by lastcraft
But even still, that 30% for ADOdb was no phantom either, was it? And for many web sites and other applications, the database becomes the bottleneck well before anything else. Ask Slashdot about 9/11. Yahoo can tell you the same thing.
In other words, I am more than willing to "go procedural" to gain that 30%. Everything I can do, as a matter of fact, before going to C or just "throwing more hardware at it", which interestingly enough is something I hear a lot from dedicated OOPers.
Ultimately, the real question is when to do anything in particular. I'm about to start on an inventory tracking and control system for the warehouse of the company I'm temping for right now (yeah, I talked myself into additional work :-^). It's not likely that this will ever be a particularly heavily loaded system, so why should I worry about it too much? I'm going to use Eclipse. All objects!
Cheers,
BDKR
If you're not on the gas, you're off the gas!
Sep 17, 2003, 19:33 #44
Originally Posted by BDKR
Anyone else notice any jabs?
Thought that post was jabbless.
Sep 17, 2003, 20:18 #45
Originally Posted by tezza
"absorb what is useful, reject what is useless, add what is specifically your own."
Cheers,
BDKR
If you're not on the gas, you're off the gas!
Sep 17, 2003, 20:20 #46
Originally Posted by BDKR
In my simple test, ADOdb's additional speed in the inner loop seemed to be overwhelmed by the additional parsing and setup time it requires. Of course, this might be different when using a PHP accelerator. It might also be different if you were doing more complex processing, where you could amortize the overhead over more DB work.
However, by my definition of bloat at the start of this thread, ADOdb would seem to be the more bloated library. This really surprised me. I expected ADOdb to be faster than PEAR before I actually ran the tests.
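For anyone who wants to repeat the experiment, the split I mean is roughly this - connection details and the table are placeholders, and the ADOdb calls are the standard ones from its manual:
PHP Code:
function microtime_float() {
    list($usec, $sec) = explode(' ', microtime());
    return (float)$usec + (float)$sec;
}

$t0 = microtime_float();
require_once 'adodb/adodb.inc.php';         // parse + setup cost paid here
$db = ADONewConnection('mysql');
$db->Connect('localhost', 'user', 'pass', 'testdb');
$t1 = microtime_float();

$rs = $db->Execute('SELECT * FROM items');  // per-row cost paid here
while (!$rs->EOF) {
    $row = $rs->fields;
    $rs->MoveNext();
}
$t2 = microtime_float();

printf("setup: %.4f sec, loop: %.4f sec\n", $t1 - $t0, $t2 - $t1);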
Sep 17, 2003, 20:57 #47
Originally Posted by Selkirk
Here is what I said.
Originally Posted by BDKR
Cheers,
BDKR
If you're not on the gas, you're off the gas!
Sep 17, 2003, 21:37 #48
Hi...
Originally Posted by BDKR
Originally Posted by BDKR
Part of my point was that things were OO by degrees anyway and that none of us were really pushing "pure" OO. We are all realists, and we differ only by some small degree in how we mix our paradigms.
Originally Posted by BDKR
You have exposed the internals and sidestepped your interface. This is really bad news and will cause problems any time you change the implementation of the original. We would both call this defactoring. I would also call it bad coding. This has nothing to do with OO, but everything to do with anyone who will ever work with your code. There is nothing philosophical about this; I simply have a lot of grim experience of working with code that does this.
Originally Posted by BDKR
Originally Posted by BDKR
Would a 150% DB overhead matter? No. Clients usually want to know the throughput versus the cost. That is a question of scalability. If I were to tell a client that the translation layer added a 150% overhead talking to the DB, but the system as a whole was comfortably within spec and budget, they wouldn't bat an eyelid. If the PEAR libraries saved money in other ways compared with ADOdb, then I would likely go with PEAR. Again, developer time is far more precious. That is not to say I would ignore performance.
Originally Posted by BDKR
Such comments show a poor understanding of the software development process as a whole if he is seriously suggesting inspecting the code and replacing accessor access with direct access at every guessed choke point. Even if profiling showed that there were a very few points where this occurred, I would still look at alternatives first. Also, that code would be facaded off and forked into its own CVS branch cul-de-sac. If the code changes, better to reoptimise the new version than waste development time working with the mangled one.
Originally Posted by BDKR
!
Originally Posted by BDKR
Originally Posted by BDKR
Is 150% acceptable for somebody writing a library such as PEAR or ADOdb? Clearly the release version of the library should be optimised. He gained 23% from all of this (very little of it, I suspect, from PHP code changes). OK, pat on the head, but for me that is not a big enough breakthrough to make me dash out and rewrite my code. More interesting is the option of the C module that he offers. That was a design change (a compiled plug-in), not a code smithing operation, and consequently far more effective.
Originally Posted by BDKR
If your colleague comes up to you and says that you need to handle a tenfold increase in traffic, are you really going to start by hand optimising every line of PHP code? Hardly. You are going to look at where the bottlenecks are and change that part of the design. Once you have done that and achieved the tenfold increase, would you go back and hand optimise the PHP code? Hardly.
Originally Posted by BDKR
Now, what worries me in all of this is that people reading it may think I am saying ignore optimisations. I am most certainly not. There are big gains to be had from measuring a system and then choosing the parts to be optimised. The big gains, though, are usually structural: caching and lazy loading mainly, though precalculating values and compiling can also have dramatic effects. By dramatic I mean orders of magnitude.
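A crude sketch of the caching kind of structural win - everything here is invented for illustration, and $builder is just the name of a function that produces the expensive output:
PHP Code:
function cached_fragment($key, $ttl, $builder) {
    $file = '/tmp/cache_' . md5($key);
    if (file_exists($file) && time() - filemtime($file) < $ttl) {
        return file_get_contents($file);   // cache hit: no DB work at all
    }
    $content = $builder();                 // cache miss: do the real work
    $fp = fopen($file, 'w');
    fwrite($fp, $content);
    fclose($fp);
    return $content;
}
One saved database round trip per hit dwarfs anything you could win by unrolling accessors.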
Now we have had this discussion before...
http://www.devnetwork.net/forums/vie...er=asc&start=0
...so this time I'll choose a newer example (from last month).
We have a database system that handles full text searches (confidentiality prevents me from naming it). We wanted to search for a particular form of expression, in particular lowercase-only queries of multiple words. The query language of the database engine allows various regular expression matching tricks, which we were using with test versions of our database. The test database is small (8 words) because we want the unit tests fast. We have developed the system without any performance tests, in the full knowledge that once the design is right we can tackle performance later.
That time is now. A single run eats 100% of a two processor box for 30+ seconds. Out come the query planners, benchmarking tools and the PHP time() function.
Our first act was to batch up the queries (several thousand of them). That took the load down, but actually made the time worse. Still, a factor of two overall for not much work. We had to change only one class interface, although I have to say a certain amount of luck was involved here: we were already batching queries up at a higher level.
The next one was more crippling. Regexes were just too slow (even in the specialist engine) as they defeated the index. We found that case-insensitive searches were fine, and so we created an extra copy of the data with the case encoded as special characters. This added some hours to the offline processing, as we were effectively precalculating the case matching. It also added a few gig of data to the server requirements. However, it did the trick. That query now takes a couple of seconds, a forty-five-fold improvement. Our development cost was one extra Perl script. No changes were needed to the main application at all, as it was all encapsulated behind one class. Total cost: two days. Total gain: 9000%. Amount of code tweaking: zero.
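The encoding idea, sketched in PHP for illustration (the real thing was that Perl script, and our actual marker scheme is confidential, so the '^' here is invented):
PHP Code:
// Build the searchable copy offline: 'Foo Bar' becomes '^foo ^bar'.
// The copy contains no uppercase at all, so a lowercase query matches it
// through the engine's ordinary index, yet exact case is still recoverable.
function encode_case($text) {
    return preg_replace('/([A-Z])/e', "'^' . strtolower('\\1')", $text);
}

function decode_case($text) {
    return preg_replace('/\^([a-z])/e', "strtoupper('\\1')", $text);
}
(The /e modifier is the old PHP 4 idiom; preg_replace_callback() does the same job.)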
Good design enables performance tuning. OO enables good design.
Anyway, thanks for coming back at me (you get the rep). You could call the optimisation above "defactoring" as well, although really it was a change of design given a new requirement. I have always thought of that as refactoring, but I'll go with the majority.
There is more meat in this argument yet.
yours, Marcus
Marcus Baker
Testing: SimpleTest, Cgreen, Fakemail
Other: Phemto dependency injector
Books: PHP in Action, 97 things
Sep 17, 2003, 21:40 #49
Originally Posted by tezza
Keeping that in mind, what do you make of this quote from the link you posted?
Ruby is a complete, full, pure object oriented language: OOL.
Here is another quote for you.
Working in an object-oriented language (that is, one that supports inheritance, message passing, and classes) is neither a necessary nor sufficient condition for doing object-oriented programming. As we emphasized in Chapter 1, the most important aspect of OOP is a design technique driven by the determination and delegation of responsibilities. This technique has been called responsibility driven design [Wirfs-Brock 1989b, Wirfs-Brock 1990].
Think about that statement. It should lay to rest the idea that good design can only happen by using an OO language, pure or hybrid, and in particular by using objects. OOP facilitates good design. That's it! It doesn't guarantee it, nor does it somehow guide you into it automagically.
As an example, here is a class that some poor Joe was asking for help with on another message board.
PHP Code:
class dataset
{
    var $sets = array();

    function pros()
    {
        include("dbs.php");
        $lk = mysql_connect($serv, $user, $pass) or die(mysql_error());
        mysql_select_db($dbname[0], $lk);
        $qur = "select * from pollqa order by id desc limit 0,1";
        $res = mysql_query($qur) or die("Error display \n" . mysql_error());
        while ($row = mysql_fetch_assoc($res))
        {
            $this->sets[0] = $row['qs'];
            $this->sets[1] = $row['ans1'];
            $this->sets[2] = $row['ans2'];
            $this->sets[3] = $row['ans3'];
            $this->sets[4] = $row['ans4'];
            $this->sets[5] = $row['ans5'];
            $this->sets[6] = $row['ans6'];
        }
        return $this->sets;
    }
}
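For contrast, a quick sketch of the same job with the responsibilities pulled apart - the names are invented and this is only one of many reasonable shapes; connection handling stays outside, and the class does nothing but fetch:
PHP Code:
class PollData {
    var $_link;

    function PollData(&$link) {   // PHP 4 style constructor
        $this->_link = &$link;
    }

    function latestQuestion() {
        $res = mysql_query(
            'SELECT * FROM pollqa ORDER BY id DESC LIMIT 0,1',
            $this->_link);
        return $res ? mysql_fetch_assoc($res) : false;
    }
}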
In short...
1) The idea that good design requires the use of objects is simply wrong.
2) The idea that an OO approach or design requires the use of an OO language is simply wrong.
Cheers,
BDKR
If you're not on the gas, you're off the gas!
Sep 17, 2003, 22:09 #50
Well, since it's past my bedtime and I have all kinds of other work to do, I'm going to let this go for the most part. I will bounce off just one thing (or is it two?).
Originally Posted by lastcraft
As for...
Originally Posted by lastcraft
Thank you and good night,
BDKR
If you're not on the gas, you're off the gas!