Due to Moore's law and the current level of CPU power on desktops, for most apps the use of C is not critical. For smaller computing devices, tight fast code makes all the difference. A program written in QBASIC running on a fast desktop runs faster than any code on a supercomputer built 25 years ago. So in terms of speed, C has an edge measured in years. I use FreeBASIC, which is compiled, not interpreted. In a couple of years' time it will be as fast as your C program is today... food for thought.
C on an old machine is slower than FreeBASIC on a modern machine... C's speed advantage is measured in years... which is a good way to look at it.
Print "hello world"
main:
print "hello world"
print Timer
Dim counter As Integer : counter = 0
Dim a As Double : a = 0.0
Dim b As Double : b = 0.0
a = Timer
do while counter < 1000000
    counter += 1
loop
b = Timer
print b - a
do while counter > 1   ' busy-wait so the console window stays open
loop
end
Re: Vision, all the methods are out there; a computer is capable of seeing and understanding the world through sight far better than any human. It's a matter of processing power and developing the technology (best written in C due to the computation required).
we should do a speed test AND SETTLE THE MATTER: write a short simple bit of code and time it, something like

input "press enter to start"; a$
for i = 1 to 10000 : print i : next i
print "done"
I'm 100% sure this thread is off topic now, but really, who wants to get back to the topic?
it's a very big misunderstanding to think that robots should be shaped like humans. It's inefficient and wanting that just reveals a sort of God complex, which might be cured with regular doses of lithium and beta blockers.
Nonsense. It takes a certain way of thinking to program! It helps immensely to know how the hardware works, but once you know the logic behind coding, any new language is just adding meat on the skeleton.
I'm usually accused of having a backwards programming stance: I learned asm first, and now I can't for the life of me understand why anyone would use C.
Hi,

Yeah, I had the same thoughts about the claimed amount of data, and I think it's sad when a grown man is so lonely that he feels a need to exaggerate to such levels that the "artistic freedom" is not only visible, but hitting you so hard that it takes a blind person to swallow it. Posts saying "I have no experience with hardware" fade into "I did a line follower in eighty-something", etc. etc.

If a lie is to be believable, it has to be at least barely plausible. Too bad it's so hard to post some of the code... oh, I forgot: no lesser part of the code will reveal the ingenious brain. So this brain, whose single parts haven't got anything worth coming after, magically wakes and works in ways that the software wouldn't reveal when all the text files fly around.

I don't find it hard to swallow, not at all... I simply don't believe a single bit of it unless some proof is posted (i.e. not what some scientist has posted about some theories, but real code that can be verified).