Hex Codes???
#11
Yeah, you're quite right, Nathan. Many times you patched programs on the fly just to be able to continue testing. It could take several days to:
1) Code the changes on coding sheets.

2) Send them to the keypunch department to be punched onto cards, or find an available keypunch machine (IBM 027 or 029) to punch the changes yourself. BTW, in the USA, source code was still being punched on cards in the 1970's, and maybe later in other countries.

3) Ok, got the new cards, now merge them by hand into your original source code deck of cards.

4) Take the source deck to the data center, fill out a request, and submit for assembly or compilation, depending on the language.

5) Keep coming back to the data center to see if your assembly or compile finished. If so, look at the listing to see if there were any errors. If you had errors you would have to start the whole process over, or decide if the error was patchable.

6) Ok, now you had an object deck (executable program on cards).

7) Now you had to go beg for some machine time to test your program. If you were lucky, they allowed you 15 minutes of computer time at about 10 PM.

8) As a result, you did a tremendous amount of desk-checking of your program code, and of the subsequent dumps of your input/output files, and sometimes memory dumps.

BTW, can't imagine fixing (patching) an entire BIOS. You really have to know the machine by heart.

What you found in the corner under a dusty sheet could not have been an IBM 360. Just the CPU was about the size of three large refrigerators. Then it had other refrigerator-size components like disk and tape controllers, plus the tape and disk drives themselves. Maybe what you found was the console. The problem with keeping old stuff running is that you can't get parts or service anymore, nor operating systems or maintenance support. It would become a hobby, like restoring an old 1955 Mercedes-Benz 300SL Gullwing (my favorite car).
#12
Wow, interesting stuff... It must have been a real bummer :P And I couldn't believe it when I learned that in 1983 one of my favourite game programmers wrote his masterpiece by assembling his Z80 code by hand in a notebook, because he couldn't afford an assembler...

Man, now I understand why we computer scientists have that reputation :D People in the 60s/70s must have really loved this stuff to work and develop on it. I think it was truly a love affair with computers; if it wasn't, I can't understand how people survived those processes.

Yeah, what I saw must have been just one part, because I've been doing some research and I've seen pictures of a system that took up a whole room. I can't imagine what it cost (taking into account that IBM's first portable computer, from around 1975-1978, cost about $15,000 on average :P).

Something I've always wondered about is the speed of those computers. They didn't have integrated microprocessors until the mid-70s, so they used hard-wired logic for the CPU. I suppose the speed was on the order of a few kHz, but... can you tell? I'm really interested. I know the first microprocessor, the Intel 4004 (a 4-bit marvel from 1971 that packed more power than the whole ENIAC into less than two inches), ran at roughly 100 kHz, so I guess those machines fell well below that, considering how much faster switching is with integrated transistors.
SCUMM (the band) on Myspace!
ComputerEmuzone Games Studio
underBASIC, homegrown musicians
[img]http://www.ojodepez-fanzine.net/almacen/yoghourtslover.png[/img]
#13
Quote:...Something I've always wondered about is the speed of those computers. They didn't have integrated microprocessors until the mid-70s, so they used hard-wired logic for the CPU. I suppose the speed was on the order of a few kHz, but... can you tell?...
Nathan, somewhere in one of my boxes of old computer stuff I have a chart of mainframe processor speeds from about 1960 to 1980. I'll try to find it. Maybe there's some info on the Internet.

Anyway, I can remember the speed of the IBM 1401, which was the first of the second-generation machines, now using transistors. The memory was core. The standard model ran at an 11.5-microsecond cycle time. Machines then were measured on cycle time, not clock speed. Most instructions took 2-4 cycles on average. You added up all the cycles of the instructions to determine the time a routine would take. I/O time was a big factor. The 1401 had no programmable interrupts and no buffered I/O, so when you read a record from tape, your program sat at the read instruction until it finished. The card reader/punch and printer were overlapped because they did their I/O from fixed memory locations.
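
To make that arithmetic concrete, here's a quick sketch of that kind of hand timing. The instruction mix and cycle counts are made up for illustration; only the 11.5-microsecond cycle time comes from the 1401 figure above.

[code]
# Hand-timing sketch: sum each instruction's cycles, multiply by the cycle time.
# The routine and its per-instruction cycle counts are hypothetical,
# not real 1401 timings.
CYCLE_TIME_US = 11.5  # IBM 1401 memory cycle time, in microseconds

routine = [          # (instruction, cycles) for an imaginary inner loop
    ("load", 2),
    ("add", 3),
    ("compare", 2),
    ("branch", 2),
    ("store", 4),
]

total_cycles = sum(cycles for _, cycles in routine)
print(f"{total_cycles} cycles x {CYCLE_TIME_US} us = "
      f"{total_cycles * CYCLE_TIME_US:.1f} us per pass")
# -> 13 cycles x 11.5 us = 149.5 us per pass
[/code]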

Later, on the IBM 360, which I think ran at about 2 microseconds per cycle (I'm guessing), there were interrupts, buffered I/O, channel processors for I/O, and microcode for executing the sophisticated instruction set. These babies were not only faster than the 1401, they ran under real operating systems (TOS, DOS, OS), they were in the same price range as a 1401, and they could run 1401 programs under emulation. They sold like hot cakes! The IBM 370's and 390's are built on the same basic architecture, and the instruction set is basically the same, just expanded.

When minicomputers became popular in the early 1970's, several of them had broken the microsecond barrier and were running at speeds like a 950-nanosecond cycle time. Considering that they had very simple (almost RISC) instruction sets, this was fast! However, to keep costs down, many of these minis used "cheap" disk and tape drives, which were pretty slow. The printers were also "cheap". Mainframes like the 360 had the largest and fastest disks, tapes, and printers on the market. Hey, that went along with the price tag.

So, taking all this into consideration, a 2-microsecond IBM 360 could run a hell of a lot faster (in throughput) than a 950-nanosecond mini. To some extent this also holds true when comparing today's super-fast PC's against apparently slower mainframes. Somebody once said, "You get what you pay for".
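
Just to put some numbers behind the throughput point, here's a toy comparison. Every figure in it is invented, and the model is crude (un-overlapped CPU time plus I/O wait for a single batch job), but it shows how a slower CPU attached to fast peripherals can still finish first:

[code]
# Toy model: job time = CPU time + I/O wait.  All figures are invented
# to illustrate why I/O speed can matter more than CPU cycle time.
def job_seconds(instructions, cycles_per_instr, cycle_time_us,
                records, ms_per_record):
    cpu_s = instructions * cycles_per_instr * cycle_time_us / 1e6
    io_s = records * ms_per_record / 1e3
    return cpu_s + io_s

# "Mainframe": 2.0 us cycle time, but fast channel-attached disks and tapes
# "Mini":      0.95 us cycle time, but cheap, slow peripherals
mainframe = job_seconds(5_000_000, 3, 2.0, 100_000, 0.5)
mini      = job_seconds(5_000_000, 3, 0.95, 100_000, 2.0)

print(f"mainframe: {mainframe:.0f} s, mini: {mini:.0f} s")
# -> mainframe: 80 s, mini: 214 s  (the slower CPU finishes the job first)
[/code]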

P.S. I just saw the picture of the 360 you posted. That was the IBM 360/20, a stripped-down, smaller version that they sold to capture the low end of the market. It was a real pig. When I talk about the 360, I'm referring to the 360/30 and up.