|
Post by Obviousman on Oct 13, 2011 18:05:14 GMT -4
Hehe - some parallels here.
This reminds me of the F-111 and what I see possibly happening now with the F-35 programme: trying to make something be all things to all people (which, funnily enough, is actually achieved with some software I mention further below). The F-111 was meant to fulfil requirements for both Navy and Air Force... and failed. Once the Navy version was cancelled and the need to satisfy competing (and often mutually exclusive) requirements dropped, it went on to be a highly successful aircraft. I think the same could be happening with the F-35: a conventional takeoff and landing (CTOL) version, a STOVL version, and a carrier (CV) version, seeking to fill multiple roles in multiple services. Sometimes you simply need a specialised tool and have to accept the cost that comes with it.
Not quite the same, but similar in some respects, is the rollout of some software (Patriot Excalibur, an aircrew resource management tool) to our squadrons. Without going into detail, the software is highly configurable and adaptable. When I rolled it out to the squadrons, they invariably told me that things had to be a certain way or that they needed to use it in a particular way... despite my recommendations as to how it should be employed. After a month or so of using the software (gaining experience) they'd come back and say "Well, yes, this is what we asked for but it is not what we want" and come up with far more appropriate requirements for me to configure to.
|
|
|
Post by ka9q on Oct 14, 2011 3:07:32 GMT -4
Speaking of elegance and computers, I just read that Dennis Ritchie, originator of the C programming language and one of the co-creators of the UNIX operating system, passed away last weekend at age 70.
Unlike Steve Jobs, who I never met, I knew Dennis Ritchie. Not well, but I was at Bell Labs Murray Hill, NJ in the early 1980s where I had occasion to talk to Dennis a few times. I'd say 'hi' in the hallway or send an email or drop by his office with a question, and it didn't matter that my department and his were completely separate, or that it wasn't his job to answer questions from young eager computer geeks like me. He was that way with everybody.
Isaac Newton famously said that if he'd seen farther than others it was because he'd stood on the shoulders of giants. Dennis Ritchie is one of the giants on whose shoulders Steve Jobs stood (and made a ton of money), and it really bugs me that most people outside the computer field will never even know his name.
|
|
|
Post by tedward on Oct 14, 2011 10:10:17 GMT -4
Just clocked this for DR Wosisname. Another nail in his/her/their expertise. Linky
Are we dead yet?
|
|
|
Post by JayUtah on Oct 14, 2011 11:18:13 GMT -4
He's been so very quiet since his thread went under moderation.
|
|
|
Post by JayUtah on Oct 14, 2011 11:31:07 GMT -4
"Unlike Steve Jobs, who I never met, I knew Dennis Ritchie."
That's fantastic. I've had the opportunity to meet many giants of the computing world, but sadly not him. I do rely greatly on his wisdom, though.
|
|
|
Post by JayUtah on Oct 14, 2011 12:47:50 GMT -4
"Although Jay has a point when you consider how important SIMD machines have become in recent years. They don't change the order of your algorithm, but they do divide N by some fixed constant."
Yes, that's why Google's software infrastructure is robust. Their map-reduce strategy is transparently scalable to any number of processors. And yes, if rules of thumb didn't have well-founded exceptions, they'd just be rules. The way Pike originally phrased it was, "...N is usually small." I paraphrase him for my clients because they're all apt to believe they are the exception to that rule.
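The reason map-reduce scales transparently is that the map step runs independently on each chunk of input and the reduce step is an associative merge, so work can be split across any number of processors without changing the result. A minimal single-process sketch in Python (a hypothetical word-count job, not Google's actual infrastructure):

```python
from collections import Counter
from functools import reduce

def map_phase(chunk):
    """Map: each worker independently counts words in its own chunk."""
    return Counter(chunk.split())

def reduce_phase(a, b):
    """Reduce: merge partial counts; associative, so grouping is arbitrary."""
    return a + b

# In a real deployment each chunk would go to a different machine.
chunks = ["the quick brown fox", "the lazy dog", "the fox"]
partials = [map_phase(c) for c in chunks]   # embarrassingly parallel step
total = reduce(reduce_phase, partials)
print(total["the"], total["fox"])  # 3 2
```

Because `reduce_phase` is associative, the partial results can be merged in any order or grouping, which is exactly what lets the framework add processors freely.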
And I think you'll understand better where I'm coming from if I go back and clarify the context.
I have some clients for whom I provide engineering solutions. They have a problem, and they want me to solve it for them. I have other clients for whom I provide engineering expertise. That is, they have their own staff and they want me to train them, or help them get past some hurdle or correct some bad habits.
I've worked with some truly colossal Ns. Really. Eric Raymond cites nuclear simulation as a problem for which N is truly large and for which a proper algorithm is a must. That's what I was doing for the Dept. of Energy. Very large Ns on very large, tennis-court size computers.
However most of the clients for whom I train and consult (as opposed to simply build for) are in the business sector. These are people with small Ns and a staff who hasn't been through the crucible of scientific or embedded-systems computing. So much of what I've said the past few days applies most directly to them.
Yes, for most reasonable Ns, the fast Fourier transform (FFT) finishes orders of magnitude faster than the naive discrete method. That's an example of when a better algorithm pays off. However, the FFT and the naive DFT have about the same level of cognitive complexity. That is, in terms of understanding, implementing, and testing the algorithm, they are roughly in the same class. Just about as much can go wrong in one as in the other. So in short, in this case, you'd need a good reason not to employ the FFT.
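To make the comparison concrete, here is a sketch of both methods in pure Python: a brute-force O(N²) DFT next to a radix-2 Cooley-Tukey FFT (N assumed to be a power of two). Note the two implementations are of comparable length and difficulty, which is the point about cognitive complexity:

```python
import cmath

def dft(x):
    """Brute-force O(N^2) discrete Fourier transform."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT, O(N log N); len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])                     # transform of even-indexed samples
    odd = fft(x[1::2])                      # transform of odd-indexed samples
    tw = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + tw[k] for k in range(n // 2)] +
            [even[k] - tw[k] for k in range(n // 2)])

x = [1.0, 2.0, 3.0, 4.0, 0.0, 0.0, 0.0, 0.0]
a, b = dft(x), fft(x)
assert all(abs(u - v) < 1e-9 for u, v in zip(a, b))  # same transform
```

Both compute the identical result; only the asymptotic cost differs, which is why the FFT is the sensible default here.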
The point of explaining it the way I did earlier is that rules of thumb set the default case properly. When in doubt, use brute force. That's a rule of thumb. You break it when you have gathered information that says you should, but not just on a whim.
Even where there is a difference between a simple algorithm that scales poorly and a complex algorithm that scales well, the difference may not matter. If the complex algorithm takes 1/10 of a second and the poorly-behaved one takes a whole second on some real-world problem, that tenfold slowdown may not hurt anyone, because both may still be orders of magnitude faster than anything else that constrains the problem.
In contrast we have scientific problems that may take 300 hours on a terascale computer. Shortening it to 270 hours gives us the answer at least a full working day sooner, which may be extremely significant. And when supercomputer time costs $100 an hour in electricity costs alone, saving $3,000 on the project budget may be worth having the programmer spend an extra week tuning and testing the algorithm.
I had one client who collected data from field stations throughout the day, then ran a consolidation program on an ordinary dual-core server, then merged the new data into a historical database. The programmers for the consolidator came to me asking what they could do to speed up their portion of the job. It was taking 40 minutes to run, and they had done everything they could think of.
We profiled their code, extracted some computational kernels to run performance experiments on, replaced a couple of their algorithms with simpler, brute-force algorithms (since their more intelligent algorithm used an intermediate store whose I/O was a bottleneck), and shaved only about 10% off their average run times.
What I came to understand later was that the merge into the historical data warehouse was taking 22 hours to run during each 24 hour cycle. The consolidation guys, whose part was only 40 minutes for an average data set, were being pressured to tighten up their act by a manager who seriously believed they should be able to cut the run times at least in half to provide the data warehouse people a longer daily maintenance window. The merge operation, he believed, simply took as long as it took, and those custom programming guys were incompetent and lazy.
In two days my DBA was able to cut the merge time down to less than 10 hours, simply by tuning their RDBMS properly.
The moral here is that you are looking for performance issues in terms of orders of magnitude for most common problems. If some part of the program dominates the others by an order of magnitude or more, that's where you should be looking to trade complexity for performance.
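The order-of-magnitude hunt described above is exactly what an off-the-shelf profiler is for: you look for the one part of the program whose cumulative time dwarfs everything else. A small Python sketch using the standard-library `cProfile` (the `hot`/`cold` functions are made-up stand-ins for a real workload):

```python
import cProfile
import io
import pstats

def hot(n):
    """Deliberately quadratic work: this is the part that dominates."""
    return sum(i * j for i in range(n) for j in range(n))

def cold(n):
    """Linear work: cheap by comparison, not worth optimizing first."""
    return sum(range(n))

def job():
    hot(300)
    cold(300)

pr = cProfile.Profile()
pr.enable()
job()
pr.disable()

s = io.StringIO()
pstats.Stats(pr, stream=s).sort_stats("cumulative").print_stats(5)
print(s.getvalue())  # 'hot' sits at the top of the cumulative-time listing
```

If the top entry dominates by an order of magnitude, that is where trading complexity for performance pays; if the times are all of the same order, tuning any one of them buys little.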
|
|
|
Post by longfuzzy on Oct 14, 2011 14:40:55 GMT -4
Back on page 30, post #437:
“If you read my post at 390 Luke you will see that according to NASA's OWN Apollo 11 voice transcript, the alleged Eagle leaves the surface of the moon from lunar coordinates; N 00 43 24 and E 23 26 24. This is a fact per the Apollo 11 voice transcript and is not in dispute. The LRRR is located at N 00 41 15 and E 23 26 00. This is a fact not in dispute and nor is our knowledge that these coordinates were provided to Lick Observatory as the LRRR's location on the evening of 07/20/1969. As records not in dispute show the Eagle to have taken off from a position more than 0.67 miles distant from the LRRR, we may conclude that Neil Armstrong did not leave the LRRR upon the surface of the moon. We can say this with absolute certainty because both sets of above referrenced coordinates are not in dispute. Not by NASA itself. As Neil Amstrong was not close enough to the LRRR's coordinates to have left it there, and as NASA knew the coordinates of the LRRR on the evening of 07/20/1969, I conclude the LRRR was placed by unmanned means prior to 07/20/1969. I suggest simply going to the referrenced part of the Apollo 11 transcript Luke and noting the Eagle's coordinates as I have above. I hope that is helpful.”
So, Lick points their laser at ‘X’ (41 15 x 23 26), and after some fine tuning of the software, gets a return. But since the LM is at ‘Y’ (43 24 x 23 26 24), 0.67 miles distant, this is proof of the hoax.
However, the laser beam, when it hits the moon, is about 4 miles (7 km) across ( www.lpi.usra.edu/lunar/missions/apollo/apollo_11/experiments/lrr/ ), or 2 miles in radius. So if the LRRR is within 2 miles of ‘X’, then the Lick guys will get a return. That beam radius easily covers the 0.67 mile separation quoted above, and for that matter the LRRR and LM could be at the exact same position. There is no evidence of a hoax under these conditions.
Is this correct? -LF
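The 0.67 mile separation and the beam-coverage argument above can be sanity-checked with a small-angle calculation (assuming a mean lunar radius of 1737.4 km; at angles this tiny, a flat-surface approximation of the quoted selenographic coordinates is more than adequate):

```python
import math

MOON_RADIUS_KM = 1737.4   # mean lunar radius (assumed value)
KM_PER_MILE = 1.609344

def dms(d, m, s):
    """Convert degrees/minutes/seconds to decimal degrees."""
    return d + m / 60 + s / 3600

# Coordinates quoted in the post: Eagle lift-off point vs. LRRR
lat1, lon1 = dms(0, 43, 24), dms(23, 26, 24)
lat2, lon2 = dms(0, 41, 15), dms(23, 26, 0)

# Small-angle flat approximation near the lunar equator
dlat = math.radians(lat1 - lat2)
dlon = math.radians(lon1 - lon2) * math.cos(math.radians((lat1 + lat2) / 2))
sep_miles = MOON_RADIUS_KM * math.hypot(dlat, dlon) / KM_PER_MILE

beam_radius_miles = 2.0   # half the ~4-mile beam footprint quoted above
print(round(sep_miles, 2))  # 0.69
assert sep_miles < beam_radius_miles
```

The separation works out to roughly 0.69 miles, consistent with the "more than 0.67 miles" in the quote, and well inside the ~2 mile beam radius, so a return from the LRRR would be expected either way.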
|
|
|
Post by trebor on Oct 14, 2011 15:18:02 GMT -4
Hi Doc, There have been several replies to that point... perhaps you should go read them before merely repeating yourself.
|
|
|
Post by ka9q on Oct 14, 2011 15:22:26 GMT -4
"That's fantastic. I've had the opportunity to meet many giants of the computing world, but sadly not him. I do rely greatly on his wisdom, though."
Dennis was an amazingly modest as well as friendly, wise and helpful guy, a perfect example of the rule that no real genius ever thinks of himself as one. This is from his own web page biography:
Compare that with the countless people who are always concerned that we fully appreciate just how bright and accomplished they are. There's just something very unfair about a world in which the people who methodically lay the solid foundations for an entire new industry are rarely known much beyond their own professions, while the "captains" of those industries, who owe their entire wealth and success to the foundation-layers, are household names.
Yes, Steve Jobs was also an important, highly creative person whose loss is also being keenly felt in the computer world. I just wish the public (to say nothing of juries on patent cases) better understood the enormously collaborative nature of science and technology, and how many smart, driven and often selfless people play a part in everything we have. Especially in the US there's the myth of the lone misunderstood genius who labors alone in his garage for decades until he produces a totally original breakthrough that breaks all the "rules" and changes the world. It simply doesn't happen that way anymore, if it ever did.
|
|
|
Post by ka9q on Oct 14, 2011 15:28:24 GMT -4
"I hope that is helpful. [....] Is this correct? -LF"
No, it's neither helpful nor correct. Might I suggest that you go back and actually read and understand what others have tried to explain to you in this rather long and unproductive thread?
|
|
|
Post by tedward on Oct 14, 2011 15:48:00 GMT -4
Record's stuck.
|
|
|
Post by JayUtah on Oct 14, 2011 16:20:28 GMT -4
I wondered how long after his JREF thread went moderated he'd be back here.
|
|
|
Post by frenat on Oct 14, 2011 21:12:02 GMT -4
What a sad person. He has nothing better to do than get banned repeatedly and ignore the answers to his nonsense?
|
|
|
Post by Obviousman on Oct 14, 2011 21:21:38 GMT -4
"What a sad person. He has nothing better to do than get banned repeatedly and ignore the answers to his nonsense?"
That's a good point, and got me wondering: the person has repeatedly demonstrated an inability / unwillingness to conform to the various rules in order to discuss the subject. Do you think the same disregard for rules extends into their offline world? Are they likely to be someone who thinks rules apply to other people and not them? If so, do you think this is typical behaviour (i.e. can't obey simple rules on a forum therefore unlikely to adhere to rules of society or law)?
|
|
|
Post by abaddon on Oct 15, 2011 6:05:41 GMT -4
Oh dear FSM, not another sock.
BTW, he is now posting in another thread at JREF.
|
|