OT - PC's: a tool in search of a need



#2

Reading the thread about the IBM 5100 made me recall the early days of the PC. The commercial showed a real estate agent and a farmer doing calculations - in BASIC. These were the days before spreadsheets had been invented.

Basically, these machines were tools in search of a use, as I recall. Am I right, or just imagining things?

To me, the biggest example is the home PC. From the beginning, it seemed that the PC makers were trying to get computers into the home, but they just couldn't seem to sell them. Those pesky users kept putting them on their desks at work instead.

As I recall, it was the CD-ROM and "multimedia PCs" that finally cracked the home - mostly because parents bought them for their children. Having a word processor and computerized encyclopedia gave the kids an advantage at school.

Circling back to calculators, someone pointed out recently that their old HP calculator (it may have been a 67) was as useful today as it was when he bought it. I think that's quite true. An HP 41 from 1980 is a lot more useful today than a dual-floppy IBM PC with PC-DOS 1.0.

Dave


#3

I think games played a big role in this too. In fact, they still do: computers have been adequate for most tasks they've been called on to do for quite a few years now, and games are the big technology driver.

- Pauli


#4

One of the early proposed "uses" of PC's in the home was "you can store your menus on them." Remember that? Lame.

Games were certainly one of the reasons that computers in the home became more prevalent; the Internet was another huge reason. But games seem to have jumped to dedicated gaming platforms these days. Remember all the Egghead software stores of the 80's and 90's? I haven't seen one in years, but I used to buy interesting computer-based games there. I think Flight Simulator 2004 was the last game I bought for my PC, and I still enjoy it.

Steve Jobs was in the news last week for criticizing Adobe and its Flash, saying it was made for the PC era, and with the advent of the iPhone, iPod, and now iPad, apps like Flash need to be updated to support touch screen and mobile computing.

Is he right? Have we passed into a new era of computing, defined by wireless touchscreen technology and apps you must download from hosts like iTunes instead of writing your own applications on your device? I was interested in the iPad until I learned that you really can't "roll your own" software, like you can on the PC.

Will the PC, as we know it, become a relic like the HP-65? I think the answer is yes, but I think the business need for the PC will be around for years to come. I can't see businesses doing their payroll on an iPad.


#5

Quote:
One of the early proposed "uses" of PC's in the home was "you can store your menus on them." Remember that? Lame.
The one I remember most in the advertising was that you could use the PC to balance your checkbook. Talk about lame!

Quote:
Steve Jobs was in the news last week for criticizing Adobe and its Flash, saying it was made for the PC era, and with the advent of the iPhone, iPod, and now iPad, apps like Flash need to be updated to support touch screen and mobile computing.

Is he right?


I hate touch screens!

Quote:
I was interested in the iPad until I learned that you really can't "roll your own" software, like you can on the PC.
A co-worker (another engineer) was extremely interested in the iPad for the same reason: he thought you'd be able to "roll your own" software, but his interest in it evaporated when he found out you can't.

Edited: 2 May 2010, 11:08 a.m.


#6

Quote:
I hate touch screens!

I haven't used them enough to have an opinion, for or against. On a calculator like the TI-Nspire, I think a touch screen would be preferable to the current navigation buttons. But to do the type of detailed things you would want to do on the Nspire, I think a stylus would be necessary.

I spent this past week training to be a census enumerator, and during the next few weeks we will visit homes that did not return their mailed census questionnaire. The Census Bureau originally planned to use handheld computers, manufactured by HTC and marketed by Harris in the US, for this operation. But, because they couldn't get the software ready and tested in time, we will be using paper forms, like all previous censuses (we did use the handheld computers a year ago to record the location of every housing unit in the country, to create the mailing files, and they worked pretty well for that operation).

Anyhow, the gal I sat next to during training had an iPhone. She showed me how you use its touchscreen to do things like run apps, zoom in and out, or expand and contract the screen. I noticed the calculator app and opened it. It was a basic four-function calc, but it seemed to have a lot of significant digits. If you tilt the device sideways, it turns into a scientific calculator. The touchscreen worked OK for that app.

It was interesting to use the iPhone for a few minutes, but I'm not going to run out and buy one. I'm sure she pays a lot more per month for phone and wireless Internet than I do for my basic cell phone (about $30), and I just don't need Internet access except on my desktop computer at home. But the younger generation is obviously enamored with these devices and is willing to pay the monthly fees associated with them.

#7

Quote:
A co-worker (another engineer) was extremely interested in the iPad for the same reason: he thought you'd be able to "roll your own" software, but his interest in it evaporated when he found out you can't.

Your co-worker needs to do a bit of research. Where do you think all the iPhone/iPad apps come from? Apple? No.

If you want to RYO iPad apps, you just need to download the free SDK (you must have a Mac to use it), and if you want to publish the app for yourself or others, then you need to pay $99/year.

So you can RYO software, but at a cost.

That all said, you can write very interesting HTML5/JavaScript apps for the built-in browser at no cost (except your time and possibly sanity). Gmail is a great example of this. It is actually better than the on-board email app.


#8

Egan, I'd say the co-worker did the research (as did I) and he probably wasn't interested in paying for a separate Mac computer to develop on or paying $99 per year to be able to run his own apps!

To develop and run your own apps on the PC, you have to do neither of these two costly things.


#9

I understand that, but that is not what was stated. It was simply "can't".

BTW, the $99/year is only to publish. In principle, if you make no changes to your app, you do not need to pay again. In practice, the OS changes so frequently that you may need to recompile or change your app.

If you jailbreak your device, you can develop at no cost and there are no restrictions. There is an alternative tool-chain; I do not have the exact details, but it's only a Google search away.

BTW, by year-end there will be alternatives. The appeal with Apple products is the size of the audience. If you want to make a buck writing apps you are going to have to pay the Apple tax.

Edited: 2 May 2010, 2:09 p.m.


#10

Quote:
Egan, I'd say the co-worker did the research (as did I) and he probably wasn't interested in paying for a separate Mac computer to develop on or paying $99 per year to be able to run his own apps!

To develop and run your own apps on the PC, you have to do neither of these two costly things.


Right.
Quote:
I understand that, but that is not what was stated. It was simply "can't".
Apparently you can't do it on the device itself like you can on a 41 or a PC. You have to have another computer, one he doesn't have and doesn't want to buy.

#11

Quote:
The appeal with Apple products is the size of the audience. If you want to make a buck writing apps you are going to have to pay the Apple tax.

No, not for me. I couldn't care less how many iPads Apple sells. My interest is developing something for myself to use and play with on this new and appealing platform, not something I want to sell to the world. I've done this on virtually every small computer I have ever used, from the Apple ][ to the Mac 128k to the Mac SE to the Mac PowerBook, and finally to the PC.

Apple obviously is not addressing this particular type of user, probably because there aren't that many of us and they can make millions with their iTunes app store. More power to them, but I choose not to be a participant.


#12

Quote:
Apple obviously is not addressing this particular type of user, probably because there aren't that many of us and they can make millions with their iTunes app store. More power to them, but I choose not to be a participant.

What Apple is doing is not unique. I can think of very few embedded devices that were designed to be programmed on-board. I cannot think of a single hand-held device, aside from a calculator, that was designed to be programmed and not just used. I cannot think of a single mass market mobile phone, gaming device, MP3 player, wristwatch, or PDA that has the ability to be programmed on-board. I know some exist, but they never really penetrated the mass market (e.g. the Linux-based Zaurus).

As for cross-development, I think Apple is taking a cue from Nintendo, Sony, and other mobile platforms where you pay to develop and publish and it is controlled. If you think Apple is bad, read up on the Gameboy and PSP. Apple's model has made them a lot of coin that others (MS and Palm) envy. Others will follow.

That all said, you are 100% correct. Apple is not interested in the hobbyist market. There will be a number of Android-based tablets soon. I think you'll find them to your liking. To each their own.

BTW, I am not defending Apple, just stating the facts. I have an iPad, and have no interest in creating apps for it. I purchased it as a replacement for paper media. The form factor and battery life make it the perfect travel companion for me to catch up on my reading and Tivo'd videos. I spend ~80 hours a month in public transportation (writing this from the airport) and a notebook is just too cumbersome for digesting media. The lack of a stylus is my only beef; I cannot create content in an appealing way without one. I'll be dumping the iPad for the first decent tablet with both touch and stylus input.


#13

Thanks, Egan. It will be interesting to see what the HP Slate has to offer in this regard.

Don


#14

Rumor is, HP killed it. The Slate was/is Windows-based. Given the recent Palm purchase, perhaps a Palm-based tablet is in the works.

#15

Quote:
The one I remember most in the advertising was that you could use the PC to balance your checkbook. Talk about lame!
That promise actually did appeal to me in those days. After all, personal computers back then really didn't do much - OTHER than fire up the imagination of those who had a clear vision of the future. Here's a great flashback on it thanks to Mr. Horn:

THE PROGRAMMABLE POCKET CALCULATOR OWNER: WHO DOES HE THINK HE IS?
by Craig Pearce (311), Berwyn, Illinois

I'm sure many will enjoy this trip down memory lane. ;)

#16

Quote:
Have we passed into a new era of computing, defined by wireless touchscreen technology and apps you must download from hosts like iTunes instead of writing your own applications on your device?

IMHO, current touch-based devices are most useful for communications and consuming content (web, games, video, music, apps, etc...), i.e. limited purpose. General purpose systems (for now) will still reign for creating content, especially content for limited purpose devices. I would not consider the iPad suitable for writing a novel, a web page, or a program.

IMHO, computers (i.e. anything with an OS) have moved beyond their primary purpose of increasing productivity to consuming content (reduction in productivity :-). As I look around my home at everything with an OS (embedded systems too), I'd have to say the majority are used for communications or to consume some type of content.

Today there are more and more consumers of information, and more computers to facilitate that consumption, so your speculation that this may define computing in the future may be correct. But I think it will more accurately define communications and media.


#17

Quote:
IMHO, current touch-based devices are most useful for communications and consuming content (web, games, video, music, apps, etc...), i.e. limited purpose. General purpose systems (for now) will still reign for creating content, especially content for limited purpose devices. I would not consider the iPad suitable for writing a novel, a web page, or a program.

IMHO, computers (i.e. anything with an OS) have moved beyond their primary purpose of increasing productivity to consuming content (reduction in productivity :-). As I look around my home at everything with an OS (embedded systems too), I'd have to say the majority are used for communications or to consume some type of content.

Today there are more and more consumers of information, and more computers to facilitate that consumption, so your speculation that this may define computing in the future may be correct. But I think it will more accurately define communications and media.


Egan, this is very well put. In my business, engineering design, we still use computers for creating (productivity) instead of consuming (as you said, the opposite of productivity). It almost seems that off-the-shelf computers increasingly come with features that interfere with productivity rather than enhance it.

#18

Quote:
Is he right? Have we passed into a new era of computing, defined by wireless touchscreen technology and apps you must download from hosts like iTunes instead of writing your own applications on your device?

Only if you're a follower of "the hippy nation," i.e. Apple. Otherwise, for $550 you can get an Archos 9 with Windows 7 and enjoy Flash, multitasking, USB, and other useful features that the iPad lacks.

I'm currently typing this on the Archos 9; I've had it for a few months now. I am waiting for other, similar devices to come out. There should be a flood of them in the next year.

Dimitri

#19

Quote:
To me, the biggest example is the home PC. From the beginning, it seemed that the PC makers were trying to get computers into the home, but they just couldn't seem to sell them. Those pesky users kept putting them on their desks at work instead.

My first computer was an Apple ][+ that I obtained in 1985 at age 17. I had a need: I wanted to program, and access to my school's computer labs was limited. Many, starting with the Altair, had the same need.

However, I'll agree that programming was not the need targeted by late-70's and early-80's commercials (or today's commercials). My parents and many of my friends' parents had computers primarily for personal finance and word processing. Or at least that was the stated need; I think my father logged more time playing flight simulators. My mother did use the computer for word processing (legal work), as did I for papers. That was another need: a powerful typewriter replacement.

IMHO, many just wanted to explore the possibilities of having a computer, as you put it, "a tool in search of a need".

Apple sold about 5-6 million Apple IIs. Commodore sold about 30 million C64s. The only thing I ever saw a C64 user do was play games. IMHO, it was games that were the ultimate need. The use of PC's for gaming contributed to the Video Game Crash of 1983. There is no doubt that it was games that got more computers into people's homes. It's applications that sell computers, not the HW. The platforms with fewer games failed.

#20

PCs were never tools in search of a need. By the time IBM introduced the first PCs, people had been using Intel 8080-based machines running CP/M, and of course Apple IIs, for years, and there was a lot of productivity software for them -- the big names I remember were WordStar, dBASE, and VisiCalc (word processor, flat-file database, and spreadsheet, respectively). The Apple II was also fast becoming a popular gaming platform, in addition to having some decent productivity software available for it -- VisiCalc was originally written for the Apple II.

The problem with the pre-PC machines was that they all used 8-bit CPUs with 64 kilobytes of address space, and by the late '70s, that was becoming a problem as software became more and more sophisticated, and people needed to be able to work with documents that didn't fit in RAM all at once. Some machines used bank switching to get past the 64 kilobyte limit, but that makes life pretty miserable for software developers. A larger address space was what was really needed.

The IBM PC was eagerly anticipated; the market was ready for a mainstream 16-bit machine. Many considered the PC a disappointment when it arrived, because other, superior CPU architectures existed, but the 8086 architecture had the advantage of assembly language source code compatibility with the 8080, so software vendors could port their CP/M products over to DOS relatively easily. This was a big, probably decisive factor in the rapid market acceptance of the PC.

For hobbyists the early PCs weren't much fun, because they lacked sound, color, graphics, and were much more expensive than the 8-bit machines... But in offices, they started popping up like mushrooms. IIRC, in Europe, the PC overtook the very popular Commodore machines (the 3000, 4000, and 8000 series) in terms of market share in about a year.


#21

Hi Thomas,

That recollection with respect to the "mushrooming" of 8086 machines matches my memories. Up to around 83 or 84, the Commodore 64, the "trash-80" and the Apple II were the ones to have. By 86 or so as I remember, the 16-bit machines were proliferating. The first Macintosh was cute, but it wasn't the workhorse until a bit later. By 1990, my university had upgraded to a 286 lab that was far and away larger than the Mac lab that had been put in in '88. The Macs had some good features, but the GUI aspect just wasn't a *productivity* enhancer yet...

I will say that one of my college hall-mates got one of the big (and I mean big!) new Macs in 1990. He had "MooseProofs" or something that would talk, and he had a flight sim on it. I think that was the beginning of a very strong rally for the Mac, which it ultimately lost when Windows was perfected. By 1995, the engineering and science users of the Mac were finding themselves out of luck and had to switch to PCs, either because their employers (hospitals, universities) were no longer supporting Macs, or because they needed software that wouldn't run on Macs...

Looking back, I guess we should see the changes in the past 10 years as positively glacial. I can happily work with year-2000 software for almost everything - even engineering software. Some of the 3D tools have gotten much better, but the 2D tools haven't really improved since 97 or 98. Word processors are worse now. And heck, you can load programs from 1995 with no trouble! Compare that to the changes from 1983 to 1988 and it is bewildering...

Of course my memories as to dates might be fuzzy here and there...

Edited: 2 May 2010, 9:29 p.m.


#22

Quote:
That recollection with respect to the "mushrooming" of 8086 machines matches my memories. Up to around 83 or 84, the Commodore 64, the "trash-80" and the Apple II were the ones to have. By 86 or so as I remember, the 16-bit machines were proliferating. The first Macintosh was cute, but it wasn't the workhorse until a bit later. By 1990, my university had upgraded to a 286 lab that was far and away larger than the Mac lab that had been put in in '88. The Macs had some good features, but the GUI aspect just wasn't a *productivity* enhancer yet...

My recollection/belief is that after the IBM PC came out, the biggest market share was non-technical corporate (i.e., not home) users. And the whole mentality of "no one ever got fired for buying IBM" was still in full force. The Mac, for all its superiority, was seen as a cute toy for geeks. Few business people wanted one.

Quote:
Looking back, I guess we should see the changes in the past 10 years as positively glacial. I can happily work with year-2000 software for almost everything - even engineering software. Some of the 3D tools have gotten much better, but the 2D tools haven't really improved since 97 or 98. Word processors are worse now. And heck, you can load programs from 1995 with no trouble! Compare that to the changes from 1983 to 1988 and it is bewildering...

Processor speeds haven't improved much in 10 years, but disk capacities have continued to grow. The trend now is towards putting more CPUs (cores) in the box, and it will be exciting to see the raw horsepower start to increase again. I've heard that Intel is already testing a 48 core CPU.

Not all applications can benefit from more cores, but many can. My point is that interesting times lie ahead.

Another big improvement over the past 10 years has been graphical power. Considering that pretty much every desktop has a high-powered 3D engine on the graphics card, it's a wonder that the desktop is still such a static 2D beast.

Quote:
Of course my memories as to dates might be fuzzy here and there...

I know what you mean. Now before I go, can someone remind me what my wife's name is :)

Dave


#23

Quote:
Considering that pretty much every desktop has a high-powered 3D engine on the graphics card, it's a wonder that the desktop is still such a static 2D beast.

What, you think we will soon be wearing red/blue glasses in front of the computer screen?
#24

Quote:
My recollection/belief is that after the IBM PC came out, the biggest market share was non-technical corporate (i.e., not home) users.

Non-technical/corporate users are exactly the market that the IBM PC was designed for. Hobbyists and home users didn't start to warm up to the PC until a few years later, when clone makers started driving prices down.


The Mac didn't gain traction early on because (1) it didn't run the productivity applications that were driving IBM PC sales, and (2) because the lack of clone makers kept Mac prices high while PC clones got cheaper and cheaper.

Quote:
And the whole mentality of "no one ever got fired for buying IBM" was still in full force.

In the small- to medium-size business segment, buying IBM computers in the pre-PC days *would* have gotten you fired, for the same reason that it's a bad idea to buy an 18-wheeler for pizza delivery. Even if that 18-wheeler is a Peterbilt.

Edited: 3 May 2010, 6:43 p.m.


#25

Quote:
(2) because the lack of clone makers kept Mac prices high while PC clones got cheaper and cheaper.

One oft-forgotten thing about IBM's PC is that IBM gave out the technical specifications manual for a fee, which allowed clone makers easy reverse engineering of the system, as it was well documented. I know I may be wrong, but I think IBM at its core never wanted the PC market.

IBM did sue people who outright copied their BIOS, but they put the data out there, from schematics to the actual BIOS code. The only reason I can see for them doing this is to standardize the computer market and eliminate the multiple competing standards, which would make their lives easier: to support their corporate customers and interfacing, they would only have to worry about the one standard they created, instead of dozens of half-documented ones.

In the next few days I'll post on my website the documents as I have them on PDF.

Dimitri


#26

Quote:
One oft-forgotten thing about IBM's PC is that IBM gave out the technical specifications manual for a fee, which allowed clone makers easy reverse engineering of the system, as it was well documented. I know I may be wrong, but I think IBM at its core never wanted the PC market.

I think the idea behind documenting the hardware was primarily to help third parties create add-in hardware, and profit a bit from selling the specs... The idea was probably inspired by the success of the Apple II, which owed a lot to third-party plug-in cards. (*)


Unfortunately for IBM, a lot of plug-in board makers decided not to bother with paying the fee and reverse-engineered the ISA bus instead, and IBM ended up losing control of ISA. When IBM released the PS/2, they tried to regain control by making the MicroChannel specification depend on a patented chip set that you had to buy from them, but by then it was too late; the industry standardized on EISA and PCI instead and IBM was officially demoted from leader to just another player.


The PC market was certainly a bit of a culture shock for IBM -- officially endorsing third-party hardware instead of trying to keep everything closed and proprietary was new territory for them!

Quote:
In the next few days I'll post on my website the documents as I have them on PDF.

Cool! Looking forward to it. :-)

- Thomas

(*) Apple, of course, managed to shut clone makers out of the Apple II market because it was impossible to create a 100% compatible clone without the Apple II ROMs. Perhaps IBM envisioned keeping the PC market to themselves in a similar manner, but because the PC's BIOS ROM was much simpler and had cleaner APIs, it was possible to simply re-create it from scratch, duplicating its functionality while sidestepping IBM's copyright.


Edited: 4 May 2010, 8:19 p.m.


#27

Quote:
Perhaps IBM envisioned keeping the PC market to themselves in a similar manner...

I don't think so. I remember reading back in the 80's that it was deliberate corporate policy to make the PC an "open" architecture. I think what happened to IBM was that they never anticipated two things: how low competition would drive the price, and how the IBM "name" would lose its draw in the face of so many seemingly excellent alternatives in the marketplace.

I say "seemingly", because my experience has been that "IBM quality" was real; you could fire up most any XT or AT today if the magnetic media hadn't failed. Problem was, of course, that the technology changed so fast the longevity of IBM components became moot.

Edited: 4 May 2010, 11:42 p.m.

#28

Quote:
Perhaps IBM envisioned keeping the PC market to themselves in a similar manner, but because the PC's BIOS ROM was much simpler and had cleaner APIs, it was possible to simply re-create it from scratch, duplicating its functionality while sidestepping IBM's copyright.

http://www.northerndtool.com

Check out the computer section. Once you see the PDF's you'll see what I mean about IBM not caring and probably wanting clone makers.

IBM gave the BIOS code in those documents, no need to recreate it, hence the lawsuits and Compact doing their 2 stage development of their own patent infringement free BIOS.

Dimitri

Edited: 5 May 2010, 4:34 p.m.


#29

Quote:
IBM gave the BIOS code in those documents, no need to recreate it...

Exactly the same point I was making, Dimitri, though not as clearly. I do remember those Technical Reference manuals, which basically told third-party developers or competitors all they needed to know to design "PC Compatible". I had a TRM for the PCjr.

#30

Quote:
IBM gave the BIOS code in those documents, no need to recreate it, hence the lawsuits and Compact doing their 2 stage development of their own patent infringement free BIOS.

I haven't read those documents yet, but I'm having a hard time parsing the sentence above.

"IBM gave the BIOS code in those documents, no need to recreate it" -- so are you saying that clone makers could just use IBM's BIOS code? But then you say "hence the lawsuits and Compact [sic -- did you mean Compaq?] doing their 2 stage development of their own patent infringement free BIOS" -- so clone makers apparently *did* need to re-engineer the BIOS? I can't figure out what point you're trying to make here.


#31

Quote:
"IBM gave the BIOS code in those documents, no need to recreate it" -- so are you saying that clone makers could just use IBM's BIOS code?

I still have the blue cloth-bound 3-ring "IBM Personal Computer Hardware Reference Library - Technical Reference" in its slipcase on my bookshelf. Complete circuit diagrams for the PC, MDA and CGA cards, etc. - even the Epson MX-80 printer they rebadged - and yes, it has the BIOS source in it. The essential point is that, having published the source code in paper form, legally there was no doubt that it was copyright - something that was contentious with regard to program code at that time (I remember being an expert witness in a software copyright case a year or two later, and it was hell trying to explain this stuff to the barristers involved).

So, companies wishing to create a BIOS had to have two teams of coders. One would use the IBM BIOS to write a specification for an abstract BIOS: it must have a routine at address such-and-such which takes these parameters in these registers and returns a result in this register, etc.

The second team had to have no IBM or IBM-compatible PC experience at all and be able to stand up in court and swear to such - they were called "virgins". They were passed the spec through a "Chinese wall" and set to work coding, whereupon the first team tested the results and refined their spec, then sent code back for rework. The result was a BIOS that was IBM-compatible but written by programmers who had never seen the IBM BIOS, and hence could not be a breach of copyright.

Best,

--- Les

[http://www.lesbell.com.au]

Edited: 6 May 2010, 8:46 a.m.


#32

Hmmm... So IBM published the BIOS source code, but not the BIOS API specification? I never understood what the deal was with this "clean room development", but if you're forced to read the code in order to re-implement the API, I can see that you might have to work that way.


It makes me wonder, though. Has anyone ever actually been convicted of software copyright infringement based on the fact that they were familiar with the original work, but *without* any evidence of plagiarism (i.e. no suspicious similarities at the source code level)?


The question is not entirely academic; I'm wondering about things like Free42, which re-creates something that I *am* familiar with, so I'm definitely not a "virgin" in the sense of your post. I could argue that I have never studied the HP-42S ROM, although I obviously did look at it long enough to extract the two screen fonts from it (*) -- but my gut feeling is that the onus should be on the prosecution to prove that I copied their work, not on me to prove that I didn't, much less that I *couldn't* have. (Clearly, I am not a lawyer; I have never even set foot inside a courtroom.)

(*) It is my understanding that low-resolution bitmapped fonts are not protected by copyright, but that's another one of those things that's probably not as clear as it should be, and dependent on which jurisdiction you're in, etc.


#33

Quote:
It makes me wonder, though. Has anyone ever actually been convicted of software copyright infringement based on the fact that they were familiar with the original work, but *without* any evidence of plagiarism (i.e. no suspicious similarities at the source code level)?

The biggest problem with the PC BIOS, as I understand it, is that there are only so many ways to do the same functions in assembly, so even good programmers with no knowledge of the original code will end up with similar code.

If any lines or blocks of lines came out the same (which would make sense, seeing as there are only so many ways to cut a sandwich), the US copyright notice "may not be reproduced in whole or in part" could still land them in legal trouble with IBM.

So distancing the creation of the code the way they did was an attempt to protect themselves, showing due diligence in creating legal code.

Dimitri

#34

Thomas, the point I am getting at, particularly after reading Les' post, is that the publishing of the BIOS code did two things: it invited third-party software and peripheral makers to jump into the PC marketplace, and it meant that clone makers HAD to be extremely diligent in documenting their reverse-engineering work, or face lawsuits.

#35

Quote:
I can't figure out what point you're trying to make here.

Les covered it well. The reason the IBM BIOS was easy to reproduce is that the BIOS code was published on paper.

Many of the first clone makers simply copied the BIOS code and found themselves in hot water with IBM's legal team.

Compaq and others at the same time developed clean-room BIOS software, where one team documented every input and output of the IBM BIOS without copying any of the code, and a second team developed code that produced the same inputs and outputs.

Today, however, companies specialize in BIOS writing. Phoenix is probably by far the oldest such company; it too used a clean-room method to develop an IBM-compatible BIOS, which it turned around in 1984 and sold to any clone maker that wanted it.

PS Compaq, not Compact, sorry I spell words phonetically at times.

Dimitri

Edited: 6 May 2010, 10:16 a.m.


#36

Dimitri's recollection of the early PC BIOS situation most closely matches my own. IBM's PC team surprisingly published the technical details of the PC's hardware design and BIOS firmware in the technical manual. (That's the only cloth, slipcased manual I've kept from those early days.) Early PC copiers simply replicated the BIOS byte by byte and ended up in lawsuits with IBM as a result, prompting other copiers to develop the cleanroom approach. It's easy to prove copying in court when the BIOS ROMs match byte for byte. I do not recall the cleanroom approach to software/firmware replication ever being done before; there was no high-volume ROM product worth copying before the PC BIOS. However, most PC development history tends to repeat minicomputer development history, which also tends to repeat mainframe development history. Something about IBM, Amdahl, and Fujitsu sticks in my head, but I really didn't follow the big-iron stuff as closely as the PC stuff.

At first, the PC cloners did their own cleanroom development and there were elaborate descriptions of the lengths the cloners went to when trying to produce "clean" BIOS code, but there was an immediate and obvious product niche quickly filled by startup companies such as Phoenix (as Dimitri writes) and Award. Once the productized PC BIOSes were on the market for purchase by PC makers, the cloners bought these BIOS ROMs to get lower-cost access to both the code and the liability protection.

