Roja wrote: I still keep a close eye on adodb-lite however.. maybe it will one day be attractive enough to switch to it.

You might want to keep a closer eye on it, as it supports all of the ADOdb date functions, some of the error classes, sessions, as well as the xmlschema.
OOP....!
- AKA Panama Jack
- Forum Regular
- Posts: 878
- Joined: Mon Nov 14, 2005 4:21 pm
AKA Panama Jack wrote:
Roja wrote: I still keep a close eye on adodb-lite however.. maybe it will one day be attractive enough to switch to it.
You might want to keep a closer eye on it as it supports all of the ADOdb date functions, some of the error classes, sessions as well as the xmlschema.

Fair enough, I didn't notice the release 5 days ago.
However, I have had some problems and found more than a few things it doesn't seem to support that I am using. I've opened a new thread since troubleshooting adodb-lite != OOP v. Procedural.
- Maugrim_The_Reaper
- DevNet Master
- Posts: 2704
- Joined: Tue Nov 02, 2004 5:43 am
- Location: Ireland
Isn't that a design issue? Well, it crosses parsing too, I see now. Just to make the point that a fair design would use lean, focused classes rather than giant Swiss Army knife types... Maybe we're just tackling the same thing from two sides - which often ends up as the same process... If you have a series of small classes then creating the instances for those modules is fairly quick. That's one of the reasons ADOdb Lite is faster.
Wouldn't that be funny - arguing with each other when we agree on similar practices... hehehe
Maugrim_The_Reaper wrote: Guess we differ there then. I don't use all the features, or if I need some select few (perf logs spring to mind, of course) I can just switch in ADOdb for a production run.

That's really all that counts then, isn't it? That's my main thrust anyways. Choose to use what you need and leave the rest at home.
Not that I have a production run... QS is still embryonic...
BDKR wrote: That sounds kinda Golden Hammerish if you ask me. In this case a "smaller is better" mantra could lead to more code in the long run, as you try to make up for the lack in the library you chose initially if you fail to accurately identify requirements.

All too true - however the Lite version does meet all my requirements - I don't require the additional features of ADOdb.
Maugrim_The_Reaper wrote:
BDKR wrote: I do not agree with your contention that parsing is of little concern.
I didn't make that contention, did I? Posting quickly here so not checking thoroughly. Was making the point (I think) that parsing is not always the overriding factor determining speed. There are dozens of other root causes, and design is often a culprit. A bad design (whether OOP or procedural) can lead to adjustments later on that ruin what optimisation you might have been building in as you coded.

It certainly seemed like it. Honestly, the sentence was kinda challenged (sorry).
Maugrim_The_Reaper wrote: However speed is far more likely to be influenced by design than simple OOP/Procedural parsing differences.

The "OOP/Procedural parsing" part really worked me! I ultimately decided that you must mean "OOP/Procedural, parsing". Notice I added the comma? We may want to consider how you define parsing in this case.
Maugrim_The_Reaper wrote: Parsing is a concern - but it's one I rarely see in simple black and white.

Parsing really is a black-and-white affair. The more code there is to parse, the longer it takes. It's that simple. Of course, some words or statements may require more work on the part of the lexer/parser, but for the most part it's going to take time to chew through that code in the parser stage.
Maugrim_The_Reaper wrote: Isn't that a design issue? Well, it crosses parsing too I see now. Just to make the point that a fair design would use lean focused classes rather than giant swiss army knife types... Maybe we're just tackling the same thing from two sides - which often ends up as the same process... If you have a series of small classes then creating the instances for those modules is fairly quick. That's one of the reasons ADOdb Lite is faster. Wouldn't that be funny - arguing with each other when we agree on similar practices... hehehe

Yes, it is a design issue. But be careful here that you don't equate the design issue with parsing concerns too closely.
On the design side, the primary concern is normally how it will affect the understanding of, management of, and maintenance/extension of the object over time.
On the parsing side, it actually depends more on the language in question as all scripting languages may (and probably do) differ to some degree in how they deal with parsing their code. For compiled languages, this stage is of no concern at all as the parsing overhead is taken care of once at the compile stage.
Cheers
AKA Panama Jack wrote:
BDKR wrote: I do not agree with your contention that parsing is of little concern. Did you look at the benchmarks on Panama Jack's site? Notice the increase in speed for ADOdb when an accelerator was used? The percentage of increase is greater than 3-fold! Parsing makes a huge difference.
This is one of the main problems I have with most programmers today.

I suspect you arrived at a conclusion about me based on just the comments above. If anybody on this board has been called or considered a code-smith in the past, I have been. Of course they use it in that derogatory tone while the "premature optimization" song is rattling along in their heads. Of course, I've also got the Rodney Dangerfield treatment, as I've chosen not to go OOP in the past for performance reasons. Of course they then automatically say that my design is faulty or questionable without even knowing the particulars of the requirements.
AKA Panama Jack wrote: We have the memory and accelerators, so who cares about how fast the code is right now? Throw it together, and if we need speed, make sure the server is using some kind of PHP accelerator. If a programmer is trained to create small and efficient code from the start, then they can create the code just as fast, if not faster, than someone who just goes for what works at the time. Most programmers that are churned out from most courses are not trained in making the code tight and efficient, but they are definitely trained on making it look PRETTY. That gets my panties in a bunch. You can make the code easy to follow and well laid out while making it tight and fast.

Parsing can be a huge and costly operation. There is no way around that fact. Just use an accelerator and be done with it.
Not knowing your experiences in development, I'll guess based on what you've done and said that you had to deal with concerns that relate to performance. Well, that's the case for me too. In the best example, I spent 3 years designing, building, and maintaining a fault tolerant cluster that saw dead serious levels of load and brutal spikes that normally lasted about 15 minutes at each occurrence. It was the nature of the business we were in. In those spikes, we would often see over a million queries in a 5 minute time span (and to think that the pundits claimed MySQL couldn't do it! LOL).
We also HAD to be mindful of the performance of our code, as getting hardware in the Third World could be problematic. And we weren't running crap either. The database servers were Supermicro pedestals with Serverworks dual-proc boards. We also ran Ultra 160 controllers, which were the cream of the crop at the time, with Seagate SCSI drives. That stuff was hard to find, so we always ordered from the States, which meant we risked the possibility of having items stolen in the mail/shipping process (no joke). Being a cash-starved startup at times didn't help matters either.
In other words, we couldn't just "throw hardware at it", as I've heard a lot of jackboot-wearing application developers state. That may be just fine (and needed when you get into that object-relational stuff), but it serves to prove that their experience is limited to application development, with no experience in mission-critical stuff.
So, to put my comment in context: I feel that one has no room to complain about the up-front parsing hit if their requirements mean the library has to be big. If that's the case, you live with it, and one way to do that is to use an accelerator.
Now, if you've thrown in the kitchen sink, or used a lib with the kitchen sink in it, without giving much thought to your needs, then don't complain to me about it. But you too had better use an accelerator in this case.
AKA Panama Jack wrote: Well, it's great to have all of those extra tools even if you don't currently need them.

Yes and no. Let's continue the analogy a little and extend it to the space required to store them. At my house, I have a one-car garage. What that means is that while my car was being turbocharged and parked in there, the amount of space was severely limited. That meant being thoughtful of where I placed/stored all of my tools so that I still had room to work. Often I had to move something out of the way while working on a certain part of the car. There were times when I needed to stop work and just clean up before things got too out of hand - like putting away tools that weren't being used (sound like garbage collection?), as an example.
The analogy of course is to memory allocation. There is an overhead to memory management that the engine suffers to some degree depending upon various things, including the instantiation of large objects. If you have limited resources (space) and are at least moderately loaded, you may want to consider the size of the tool set, no?
In this case, I want a memory upgrade in the form of a shed in the back yard where I can put non-car related tools.
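To put a rough number on the memory side of the analogy, here's a hypothetical sketch (not from ADOdb or any real library; function name invented) that measures what a batch of object instances costs the engine:

```php
<?php
// Rough illustration (not ADOdb-specific): measure how much memory a
// batch of object instances costs the PHP engine.

function instance_overhead($count)
{
    $before = memory_get_usage();
    $objects = array();
    for ($i = 0; $i < $count; $i++) {
        $objects[] = new stdClass(); // each instance carries bookkeeping overhead
    }
    $after = memory_get_usage();
    return $after - $before; // bytes consumed by $count instances
}

// echo instance_overhead(10000); // absolute number is version/machine dependent
```

The absolute figure varies by PHP version, but the point stands: every instance you create but never use is shelf space you no longer have.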
AKA Panama Jack wrote: And having those extra tools won't slow you down unless you are silly and try to use them all at the same time.

This is the interesting thing in comparison to some other languages. Objective-C is an example: there is no slowdown in method lookup as the number of methods increases inside an object.
Oh well.... back to work.
- Buddha443556
- Forum Regular
- Posts: 873
- Joined: Fri Mar 19, 2004 1:51 pm
BDKR wrote: Parsing really is a black and white affair. The more code there is to parse, the longer it takes. It's that simple. Of course, some words or statements may require more work on the part of the lexer/parser, but for the most part it's going to take time to chew through that code in the parser stage.

The processor is only one component involved in parsing PHP, and the processor is the fastest component. Disk operations are the slowest, as the PHP files must be read before the scripts are parsed. Opcode caches eliminate the disk operations involved in parsing PHP, which is certainly more significant than the compiling done by the processor: a difference of milliseconds compared to microseconds.
I think it's significant that without using an opcode cache, ADOdb-Lite has been able to achieve almost the same optimization (86% of ADOdb with an accelerator) that an opcode cache offers. IMHO this has more to do with the reduction of the memory footprint (by two-thirds) than with the processing of the PHP. If nothing else, such an achievement definitely deserves praise, however it was achieved.
Buddha443556 wrote: The processor is only one component involved in parsing PHP, and the processor is the fastest component. Disk operations are the slowest, as the PHP files must be read before the scripts are parsed. Opcode caches eliminate the disk operations involved in parsing PHP, which is certainly more significant than the compiling done by the processor: a difference of milliseconds compared to microseconds.

Err... when was the processor mentioned?
So how is a file parsed? Is it read into memory in some way and then parsed, or parsed as it is read in? Whatever the case may be, one certainly sounds a lot more braindead than the other.
But that aside, bringing hardware into it muddies the waters without need. It's already been established that a large file takes a large amount of time to parse. The particulars of that fact aren't as important as the fact itself in this conversation.
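If you want to see the "more code, more time" point for yourself, here's an illustrative micro-benchmark (function name invented; eval() both compiles and runs the string, and absolute times are machine-dependent, so only the trend matters):

```php
<?php
// Illustrative micro-benchmark: time eval() over progressively larger
// chunks of trivial code. eval() compiles the string on every call, so
// the cost grows with the amount of source handed to the parser.

function chew_time($statements)
{
    $code = str_repeat('$x = 1;', $statements); // N trivial statements
    $start = microtime(true);
    eval($code);
    return microtime(true) - $start; // seconds spent compiling + running
}

// chew_time(200000) will be noticeably larger than chew_time(100).
```

Note this deliberately sidesteps the disk-read question: the source is already in memory, and the parser still takes longer on the bigger chunk.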
Cheers,
BDKR
- Maugrim_The_Reaper
- DevNet Master
- Posts: 2704
- Joined: Tue Nov 02, 2004 5:43 am
- Location: Ireland
BDKR wrote: The "OOP/Procedural parsing" part really worked me! I ultimately decided that you must mean "OOP/Procedural, parsing". Notice I added the comma? We may want to consider how you define parsing in this case.

Don't see it, to be honest - I was simply making a statement that design often outweighs parsing as an optimisation concern. Take that with a pinch of salt, since I think most of us are aware of how to optimise the parsing end of things.
The simplest example I can think of, seen recently, was a Data Object holding a collection of Transfer Objects (each representing a row of data from a database). Not all member objects were used on each request - but all were loaded from the database on every request regardless. I find issues like these far more common than parsing problems - but then this is me talking; my experiences have nothing to do with your own...
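A minimal sketch of the lazy-loading fix for that scenario - all class and method names here are invented for illustration, not taken from the code being described:

```php
<?php
// Hypothetical sketch: build a Transfer Object only when its row is first
// requested, instead of loading every row on every request.

class TransferObject
{
    public $row;
    public function __construct(array $row) { $this->row = $row; }
}

class DataObject
{
    private $db;
    private $loaded = array(); // cache of already-built Transfer Objects

    public function __construct($db) { $this->db = $db; }

    public function get($id)
    {
        // Lazy: hit the database the first time only, then reuse.
        if (!isset($this->loaded[$id])) {
            $this->loaded[$id] = new TransferObject($this->db->fetchRow($id));
        }
        return $this->loaded[$id];
    }
}

// A stub database so the sketch is self-contained.
class StubDb
{
    public $queries = 0;
    public function fetchRow($id)
    {
        $this->queries++;
        return array('id' => $id);
    }
}
```

With this shape, a request that touches two rows costs two queries, no matter how many rows the collection could hold - which is exactly the difference between the eager version above and the lazy one.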
BDKR wrote: Yes, it is a design issue. But be careful here that you don't equate the design issue with parsing concerns too closely.

Wouldn't dream of it - they are two different things...
BDKR wrote: On the design side, the primary concern is normally how it will affect the understanding of, management of, and maintenance/extension of the object over time.

I'd also list expected behaviour as a design concern. If you don't expect a heavy database load and, et voilà, it happens - it's probably a design issue. Most of us probably know all the solutions to such scenarios (lazy loading, object marking, et al.), but they are out there. Lots of folk don't even notice when they occur...
I'm just plain misunderstood...no fair. Sniff. I'm going to go sulk for a while...
- AKA Panama Jack
- Forum Regular
- Posts: 878
- Joined: Mon Nov 14, 2005 4:21 pm
BDKR wrote:
AKA Panama Jack wrote:
BDKR wrote: I do not agree with your contention that parsing is of little concern. Did you look at the benchmarks on Panama Jack's site? Notice the increase in speed for ADOdb when an accelerator was used? The percentage of increase is greater than 3-fold! Parsing makes a huge difference.
This is one of the main problems I have with most programmers today.
I suspect you arrived at a conclusion about me based on just the comments above.

No, I haven't made any conclusions about you.
BDKR wrote: Not knowing your experiences in development, I'll guess based on what you've done and said that you had to deal with concerns that relate to performance.

I am from the old school when it comes to programming. I started back in the days when it was all about getting the most out of as little code as possible because of the limitations in memory.
BDKR wrote:
AKA Panama Jack wrote: Well, it's great to have all of those extra tools even if you don't currently need them.
Yes and no. Let's continue the analogy a little and extend it to the space required to store them. At my house, I have a one-car garage. What that means is that while my car was being turbocharged and parked in there, the amount of space was severely limited. That meant being thoughtful of where I placed/stored all of my tools so that I still had room to work. Often I had to move something out of the way while working on a certain part of the car. There were times when I needed to stop work and just clean up before things got too out of hand - like putting away tools that weren't being used (sound like garbage collection?), as an example.

That's not really what I was talking about.
It's like ADOdb Lite: you only upload to the server the sections of the library you need, not the entire library. There's no need to install drivers for every database ADOdb Lite supports - just install the ones you will be using. The same goes for the modules in each driver: you don't need to install the date module if you are never going to use it.
This is the problem with a program like ADOdb, as it has a ton of features and abilities. Most of them are esoteric in nature and little used by many applications. Unfortunately you are stuck with them, and with the inherent memory overhead and slow instance creation. That is like holding half a dozen tools in both hands that you are not going to need while working on your car.
A package like ADOdb Lite is like having all of the same tools as ADOdb, but you don't have to hold all of them at the same time to work on your car. You leave the ones you will not use in the toolbox.
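The pick-your-tools approach can be sketched roughly like this - the file layout and function names are invented for illustration and are not ADOdb Lite's actual structure:

```php
<?php
// Hypothetical sketch of loading only the driver and modules you need.
// The path layout is invented; it is not ADOdb Lite's real layout.

function driver_files($driver, array $modules = array())
{
    $files = array("drivers/$driver.php"); // just the one driver in use
    foreach ($modules as $module) {
        $files[] = "modules/{$driver}_{$module}.php"; // e.g. the date module
    }
    return $files;
}

function load_driver($driver, array $modules = array())
{
    foreach (driver_files($driver, $modules) as $file) {
        require_once $file; // unrequested modules are never even parsed
    }
}

// load_driver('mysql', array('date')); // everything else stays on disk
```

The payoff is exactly the thread's point: code that never gets included never gets read from disk or parsed, accelerator or not.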
- AKA Panama Jack
- Forum Regular
- Posts: 878
- Joined: Mon Nov 14, 2005 4:21 pm
Maugrim_The_Reaper wrote:
BDKR wrote: Yes, it is a design issue. But be careful here that you don't equate the design issue with parsing concerns too closely.
Wouldn't dream of it - they are two different things...

They are two different things, but I personally think they go hand in hand with each other. A good programmer will sit down and work out the parsing based upon the design of the project. I work out the design, then start on the parsing that is needed, and then change the design based upon the limitations or abilities available in the language. If I don't, I end up with something that follows the design but ends up being an unwieldy, slow mess.
- Maugrim_The_Reaper
- DevNet Master
- Posts: 2704
- Joined: Tue Nov 02, 2004 5:43 am
- Location: Ireland
It is. Everyone else is doing the opposite. Yes, you are an exception. 
j/k - They are two concerns, and there is some play between them. I just notice a lot that often design is neglected in discussing optimisation (not saying everyone does, but not everyone has the experience to tell them otherwise).