Fatware
#1
What is the deal with the new wave of fatware? Is it just that companies want to release software as quickly as possible, so they don't bother optimizing for space? In the good old days of computing, when programmers only had 640K to work with, a lot of effort went into figuring out how to get a complicated program to fit in that space. Now that we have 2 or 4 GB of RAM, they're being way too liberal with it. I remember having a 4MB machine that ran MS Word 6.0. I look at the newest version of Word and see some differences, but the basic functions are all the same. How did it go from taking up less than 4MB to half a gig? I know there are a lot more functions now, but there really can't be THAT many.

Apple's guilty of making fatware too. Claris Works used to take up 4MB of RAM as well, and that covered word processing, spreadsheets, painting, drawing, and everything else. Pages now comes to 896MB on disk. Linux programmers still seem to write fairly small programs.

I blame the compilers more than the programmers. They're most likely using C++ because it makes it easier to build large programs and easier to debug them. There needs to be much more research into compilers that optimize code and strip out pieces that don't need to be compiled. Anything that can be written in C++ can be written in C given enough time, and the C program will be much, much smaller and much, much faster. We just have to figure out how to make compilers that think more like converters.

Sorry, I was downloading a program last night that was 400MB and it pissed me off. So I ranted about it. Comment or flame away.
If you play a Microsoft CD backwards you can hear demonic voices. The scary part is that if you play it forwards it installs Windows.
#2
C++ programs are just about the same speed as, or faster than, C programs. Blitz or marzec probably knows more, since they're compiler geeks.

The very first C++ compilers just converted C++ code into C and then fed it through a C compiler. So writing a converter would be a step backwards.

The extra fat in programs is from the extra complexity and/or more detailed data files. If you check the size of the binaries in modern applications, they're actually pretty small.

Code:
[chris@opium code]$ ls -al /usr/bin | sort -n -k 5 | tail -n 5
-rwxr-xr-x  1 root root    3736852 Sep  9 18:55 Xvfb
-rwxr-xr-x  1 root root    4138112 May 30 13:09 gs
-rwxr-xr-x  1 root root    5354556 Aug 30 22:54 php
-rwxr-xr-x  1 root root    5753196 Jun 27 11:33 mencoder
-rwxr-xr-x  1 root root    6390412 Jun 27 11:33 mplayer

The heaviest common executable I have is 6 megs - that's probably because it's statically linked with a bunch of other libraries (since it's a media player, that seems a fair assumption).

Personally I don't see the problem - for 99% of projects, space isn't an issue. We should be happy that we're free of those chains, not pining for the days when we were shackled up.
[img]http://www.cdsoft.co.uk/misc/shiftlynx.png[/img]
#3
I think both sides have a point. Sure, I don't like seeing programs that are bloated all to hell, but at the same time, I don't like being restricted as a developer. People would do well to do some better optimization these days, but if it ain't broke, don't fix it.
\__/)
(='.'=) Copy bunny into your signature to
(")_(") help him gain world domination.
#4
It was easier to make programs smaller back in the day because you had fewer programmers working on a project. If you had to use some weird tricks to reduce the size, it wasn't too hard to explain them to everybody on the development team. Now that programs are much larger, far more developers are involved, and it is highly likely that many of the developers (if not the entire team, in some cases) will change during a project's lifetime. Because the projects are so large, individual developers cannot hope to understand all of the code in a project anymore. Therefore companies now focus on creating maintainable, well-tested code with well-defined APIs rather than using clever tricks to reduce the amount of memory taken. Additionally, many programs these days are designed to be portable (even if just between multiple versions of the Windows API), making OS- or architecture-specific hacks a bad idea.

A lot of older programs took more memory than was available and did their own form of memory management (such as using link overlays) to swap parts of the program in and out while they were running. These days there is no need to do this, since modern operating systems provide paged virtual memory.

It is more difficult with modern operating systems to tell the actual size of a program in memory. Windows and Linux, for example, both show the memory usage of a process as the total memory used by both the process and any libraries it is dynamically linked with. A single library (such as libc) may be linked into many processes, meaning there is only one copy of it in memory, yet it appears in the size counts of every one of those programs. For this reason, it is possible on Windows and Linux for the memory usage (working set) totals to exceed the amount of physical memory you actually have.
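
For illustration, here's a minimal sketch of how you could look at those numbers yourself, assuming Linux and the /proc filesystem (just an illustration, not from any real project): it prints the VmSize and VmRSS lines for the current process, and the resident figure includes pages of shared libraries that are also counted against every other process mapping them.

Code:
// minimal sketch, assuming Linux and the /proc filesystem
// prints total virtual size (VmSize) and resident set size (VmRSS)
#include <fstream>
#include <iostream>
#include <string>

int main() {
    std::ifstream status("/proc/self/status");
    std::string line;
    while (std::getline(status, line)) {
        // VmSize counts every mapping, shared libraries included;
        // VmRSS counts resident pages, so summing it over all running
        // processes can exceed the amount of physical RAM in the machine
        if (line.rfind("VmSize:", 0) == 0 || line.rfind("VmRSS:", 0) == 0)
            std::cout << line << '\n';
    }
    return 0;
}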

As for the compiler part (my Masters research is in compiler theory), translating from C++ to C is likely to do more harm than good. Most of the commercial compilers use the same backend for both C and C++ anyway, so you would be getting exactly the same optimisations. Compiler optimisation is a heavily researched (and very difficult) field, and some of the optimisations done by compilers are pretty insane. One of the major problems with compiler optimisation is that many of the algorithms are either NP-complete or have very large theoretical complexities, meaning that shortcuts and heuristics need to be found if the compilation process is ever going to finish in a realistic amount of time.
Jesus saves.... Passes to Moses, shoots, he scores!
#5
Since you have a Masters in compiler theory, I have a question: most libraries in object-oriented programming have most of their functions overloaded. I know that when I write a library I overload the shit out of almost every function. Some of those overloads are never used. Do compilers sort out which functions are being used and which ones aren't, or do they compile every function? I'm just curious.
If you play a Microsoft CD backwards you can hear demonic voices. The scary part is that if you play it forwards it installs Windows.
#6
Yes.

The overhead isn't in the code most of the time. Most of the space is taken by flashy graphics and stupid sounds... But they are so cool, and my hard disk is big.
SCUMM (the band) on Myspace!
ComputerEmuzone Games Studio
underBASIC, homegrown musicians
[img]http://www.ojodepez-fanzine.net/almacen/yoghourtslover.png[/img]
#7
If you want a library to be linked dynamically, then you cannot determine at compile time which functions will be called, so you need to compile all of them. If a library is statically linked, then the linker may be able to remove some unused functions. The virtual memory system in modern operating systems means that only the parts of a dynamic library you are actually using are likely to be loaded into memory.
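
To make that concrete, here's a tiny sketch (assuming a GCC/GNU ld toolchain, so the flags below are just one example of how this looks in practice): two overloads, only one of which is ever called.

Code:
// overloads.cpp - sketch only; the int overload is called, the double one never is
#include <iostream>

void report(int n)    { std::cout << "int: "    << n << '\n'; }
void report(double x) { std::cout << "double: " << x << '\n'; }

int main() {
    report(42);   // report(double) is never referenced here
    return 0;
}

Built as a normal executable with something like g++ -O2 -ffunction-sections overloads.cpp -Wl,--gc-sections, the linker is free to throw the unused double overload away (you can check what survived with nm). Built as a shared library with both overloads exported, neither can be dropped, because the compiler has no idea which ones outside code will call.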

Like na_th_an said, it's not the code that takes up huge amounts of space, it's all the other stuff. Lots of things in computer science have a memory space/execution speed tradeoff. Hard disk space and RAM are cheap and plentiful these days, so it often makes sense to trade space for speed.
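
A classic (made-up) example of that tradeoff is a lookup table: spend a couple of kilobytes of memory once so that each later query is a cheap array index instead of a recomputation.

Code:
// space-for-speed sketch: precompute 256 sine values (~2 KB) once,
// then each "call" is just an array lookup instead of std::sin()
#include <array>
#include <cmath>
#include <cstdio>

static const double PI = 3.14159265358979323846;

std::array<double, 256> make_sine_table() {
    std::array<double, 256> t{};
    for (int i = 0; i < 256; ++i)
        t[i] = std::sin(2.0 * PI * i / 256.0);
    return t;
}

int main() {
    static const std::array<double, 256> table = make_sine_table();
    std::printf("%f\n", table[64]);  // index 64 is sin(pi/2), i.e. 1.0
    return 0;
}
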
Jesus saves.... Passes to Moses, shoots, he scores!
#8
Besides, people who complain about the bloat of today's software and operating systems, and miss the simplicity of MS-DOS running in 256K of RAM, fail to realize that nowadays you want to hot-plug stuff into your computer, you want to manage HUGE hard disk drives, you want on-the-fly hardware detection, fast internet, integrated multimedia, and many other things. Those things take space, chaps ;)

Do you think that, when you plug in your USB drive, it's detected, mounted and managed by magic?
SCUMM (the band) on Myspace!
ComputerEmuzone Games Studio
underBASIC, homegrown musicians
[img]http://www.ojodepez-fanzine.net/almacen/yoghourtslover.png[/img]
#9
I'm not really complaining about operating systems and plug-and-play drivers. They are necessary and don't take up a whole lot of space. Runtime size doesn't bother me much at all; I have a gig of RAM, UNIX takes care of all of that for me, and I've never had to dip into virtual memory. What I hate is downloading massive, overfeatured projects. Graphics are cool, so I don't really mind when it comes to games and such. My beef is with Apple Pages, MS Office, and the worst one: .NET! :evil: They only have RAM footprints of around 30MB, but the download sizes are crazy. I'll bet all three of those have functions that no one has ever used outside of testing.
If you play a Microsoft CD backwards you can hear demonic voices. The scary part is that if you play it forwards it installs Windows.
#10
The only time size pisses me off is when I have to download it (stuck on f***ing dial-up, 'cause I live in the middle of nowhere, forgotten about :P ).... Other than that, I already figure in the size of graphics files, music, etc.... I know back when xteraco and I were working on small demos, the file size tripled when we added music and such.... The actual code was rather small, KB-wise.... o.O

Find better compression for cool/sweet music and graphics, and then you can get smaller programs....

.... And those old MS-DOS programs that fit in 640KB? You remember those were like the lowest-res, slowest applications, right? :P .... Most people like flashy curves and clean-looking stuff.... Why do you think we have glowing cases, sexy-looking cars, weird-looking furniture..

'Course you have those who like plain white cases, drive old Model T's, and have very plain furniture, but not everyone is like that....

:roll:
Kevin (x.t.r.GRAPHICS)

[Image: 11895-r.png]

