computer shopping

Place to discuss anything, almost. No politics, religion, Microsoft, or anything else that I (the nazi censor) deem inappropriate.
worker201
guru
guru
Posts: 668
Joined: Sun Jun 13, 2004 6:38 pm
Location: Hawaii

computer shopping

Post by worker201 » Wed Jan 03, 2007 4:14 pm

I want a laptop computer I can install Linux on. It needs to have at least a GB of RAM, a decent processor, sound, and a graphics card that doesn't require 3rd party drivers. A wireless card that works out of the box would be nice too. Finally, I don't want to pay very much. I can get a Dell that meets my requirements for about $900. Anything cheaper out there?

Also, I was reading about large file support recently. My understanding is that there is a 2GB limit on 32bit Linux, although this can be worked around (see question 14 in this article). I need to be able to work with large files. Does this mean I should get a 64bit computer? I was thinking of running Slackware, is there a 64bit Slack out there, or would I have to run Slack32? If I have a 64bit processor and use a 32bit OS, does that pretty much defeat the purpose? As you can see, I could use a little help straightening out this whole 32/64 thing.

User avatar
Void Main
Site Admin
Site Admin
Posts: 5716
Joined: Wed Jan 08, 2003 5:24 am
Location: Tuxville, USA
Contact:

Post by Void Main » Wed Jan 03, 2007 4:44 pm

Actually, Linux (the 32-bit version) has supported files larger than 2GB for quite some time. The last time I remember a 2GB limit was either in kernel version 2.0.x or 2.2.x. I am pretty sure that by 2.4.x things were set up for > 2GB by default. At any rate you should have no problem with this on any recent distro (running 2.4.x or 2.6.x kernels). It would be very hard to work with DVD images if this were not the case.

Regarding a cheap (low cost) laptop, I have recently purchased two HP laptops for my kids that work very well. The first one I bought last Christmas on a Black Friday special at Wal-Mart, and I purchased another one this past Thanksgiving on a similar deal. I paid less than $400 for both of them. The first one only had 256MB of RAM and a 40GB drive, but the second one had 512MB, a 60GB drive, and a nice widescreen display. Both have built-in wireless (802.11g) and I was able to get them both working with ndiswrapper and WPA encryption after a certain amount of fiddling. I'm running FC5 on one and FC6 on the other.

I have an IBM ThinkPad with 1.5GB of RAM that I use for work (I'm typing this on it right now) and it works well with Linux, but I don't think you will find one of these for $900. I also have a few older Dell laptops at home that I am running Fedora Core 5 on. I am surprised that a new Dell will run Linux out of the box, including wireless. My experience was that support for the most recent Dell laptops wasn't great, but that may have changed. Just posting some of my experiences in case they help.

worker201
guru
guru
Posts: 668
Joined: Sun Jun 13, 2004 6:38 pm
Location: Hawaii

Post by worker201 » Wed Jan 03, 2007 5:22 pm

We use Dell laptops exclusively at work, so I have had the opportunity to explore their capabilities at length. It is true that the low-priced Dell laptops sold to home users only come with Dell Wireless 1390 mini-cards, which are not out-of-the-box compatible with Linux. However, small-business customers can get the optional Intel 3945 cards, which are very Linux compatible. Small-business customers can also get computers with built-in RS-232 serial ports, another option not available to home users.

However, these low-end Dells only have XGA screens, and I would kinda like to have something a little tougher. I'll check what they have at some of the local stores and see if I can't find a good deal.

kasperd
n00b
n00b
Posts: 1
Joined: Fri Jan 05, 2007 7:51 pm
Contact:

Post by kasperd » Sat Jan 06, 2007 8:25 am

I am the author of the linked FAQ, and I just wanted to clarify question number 14. The answer is intended for developers who find that their programs do not work with files larger than 2GB by default. But using larger files on 32-bit Linux is no problem as long as you have the right defines at the beginning of the program, before any includes.
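To make that concrete, here is a minimal sketch of such a program (just an illustration, not code taken from the FAQ), built with large file support on 32-bit glibc:

#define _FILE_OFFSET_BITS 64   /* make off_t 64 bits wide, even on a 32-bit build */
#define _LARGEFILE_SOURCE      /* expose fseeko()/ftello() */

#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>

int main(int argc, char *argv[])
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s <file>\n", argv[0]);
        return 1;
    }

    int fd = open(argv[1], O_RDONLY);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    /* With the defines above, lseek() returns a 64-bit offset,
     * so files past the 2GB mark report their real size. */
    off_t size = lseek(fd, 0, SEEK_END);
    printf("%s is %lld bytes\n", argv[1], (long long)size);

    close(fd);
    return 0;
}

The two defines at the top are the whole trick; without them, the same code built for a 32-bit target gets a 32-bit off_t and fails on files of 2GB or more.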

This is all aimed at developers. What it means for users is that the most important programs on 32-bit Linux do work with large files by now. There are a lot of programs that rarely touch files larger than a few MB, and for those you might never notice whether they support large files or not.

As a user you may run into a program which is typically used for files in the 100MB to 1GB range and be surprised to find it not working with files of 2GB or more. In that case you could blame the developer of the program, or you could blame the developers of glibc for not making large files supported by default. A much more productive approach would be to download the source, add a few defines in the right places, compile and thoroughly test the program on large files.

Thus as a user, even if you need to work with files larger than 4GB, you might be able to do so on a 32-bit machine. However, there are cases where you might still benefit from a 64-bit machine. If you have more than 896MB of RAM, memory management in IA32 Linux starts getting tricky. Older kernels didn't support it at all. But even though 2.4 and later can handle more RAM, you don't get the full benefit of the RAM above 896MB. If you have 2GB or less, you probably shouldn't worry about this. If you have 4GB or more, you definitely should: with that much RAM, a 32-bit architecture will be too limited.

There is another reason why you might want a 64-bit architecture (even if you have less than 1GB of RAM): the virtual address space. Even if you don't have that much physical RAM, you might still need that much virtual address space. There are a number of reasons why your virtual address space might need to be significantly larger than your physical RAM. If you have enough swap, a process might simply use more. In some cases the most efficient way to solve a problem might involve mapping the same physical RAM at a few different virtual addresses. Because of fragmentation you might not be able to use all of the available address space. Finally, a program might need to mmap large files, in which case you will need address space for that. Combine the four, and a few GB of address space will not seem like a lot. There are tweaks which will help you get the most out of a limited address space, but in the end they will only buy you a little, and if your data grows, you might end up needing a 64-bit architecture anyway.
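The mmap case is the one that tends to bite first. As a rough sketch (a hypothetical example; the file name is made up), consider a program that maps a whole data file at once:

#define _FILE_OFFSET_BITS 64

#include <stdio.h>
#include <stdint.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/mman.h>
#include <sys/stat.h>

int main(void)
{
    const char *path = "huge_grid.dat";   /* hypothetical data file */
    int fd = open(path, O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    struct stat st;
    if (fstat(fd, &st) < 0) { perror("fstat"); close(fd); return 1; }

    /* A 32-bit process has only about 3GB of virtual address space, so a
     * map this large can fail with ENOMEM no matter how much RAM or swap
     * the machine has. A 64-bit process has room to spare. */
    if ((uint64_t)st.st_size > (size_t)-1) {
        fprintf(stderr, "file too large to map on this architecture\n");
        close(fd);
        return 1;
    }

    void *map = mmap(NULL, (size_t)st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    if (map == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

    /* ... process the mapped data here ... */

    munmap(map, (size_t)st.st_size);
    close(fd);
    return 0;
}

Large file support gets you past the 2GB file size limit, but it does not buy you any more address space; that is the part only a 64-bit architecture can fix.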

As a developer you should be able to figure out if your applications need more than the 3GB of address space they can typically get on a 32-bit architecture. As a user you might want to trust what the developer of the software tells you. If there is a particular piece of software you need to use to process large amounts of data, and the documentation doesn't make any recommendations about 32 vs. 64-bit architecture, I think you should go ahead and ask the developer what is recommended. (The answer might be that the application can benefit from a 64-bit architecture, but has received the most testing on a 32-bit architecture. In that case you will have to ask yourself whether you want to be the guinea pig and possibly gain the ability to work with larger amounts of data, or whether you want to use a 32-bit architecture and hope you don't need to work with data too large to be handled.)

User avatar
Void Main
Site Admin
Site Admin
Posts: 5716
Joined: Wed Jan 08, 2003 5:24 am
Location: Tuxville, USA
Contact:

Post by Void Main » Sat Jan 06, 2007 9:06 am

Kasper, thank you for your input on this. I looked over your site and I find some very impressive things!

I got the feeling that worker was under the impression that large file support wasn't possible on current 32-bit Linux systems. You are very correct that you still have to write your program to take advantage of it. On a somewhat related note, I have run into a few issues in the past when running software across architectures. For instance, I had a counter problem running the SNMP server on Debian on a Sun SPARC (64-bit) because of differences in integer sizes. I created a patch for net-snmp to fix it and sent it in. It was just a hack and I hope the maintainers implemented a better solution:

http://voidmain.is-a-geek.net/forums/vi ... php?t=1036

JoeDude
administrator
administrator
Posts: 355
Joined: Sun Feb 08, 2004 1:41 pm
Location: Sutton Coldfield, UK
Contact:

Post by JoeDude » Sun Jan 07, 2007 4:30 am

Thank you for the information. I was wondering myself what the implications of a 64-bit architecture would be. I probably haven't gone to the right places to ask and read, but for the most part it seemed to me that a lot of people really didn't know. That post actually cleared a lot up for me.

worker201
guru
guru
Posts: 668
Joined: Sun Jun 13, 2004 6:38 pm
Location: Hawaii

Post by worker201 » Mon Jan 08, 2007 2:07 pm

Awesome.

Aside from transcoding video and dvd production, 99% of all the files I process are under 1GB. So I don't really run into filesize limits all that often. In fact, I've only hit any sort of wall just once. I was using GMT to mathematically surface a 1.9GB file, and it just plain gave up - gave me a "ran out of memory" error. At the time, I was running FC3 (a 2.4 kernel, I think) on a P4 with 1.5GB RAM. However, since it was a geographical grid, it was pretty easy to break the file into smaller chunks and process them separately, and then put them back together at the end before producing the image.

But hey, you never know when I'm going to get another monster file.

Overall, though, it seems like the additional expense of getting a 64-bit processor in a laptop is a bit of overkill at this point, especially since there are a few things that can be done to extend the capabilities of a 32-bit system for the average user and/or recreational developer.

Thanks a ton, though, for your input and advice.

User avatar
Void Main
Site Admin
Site Admin
Posts: 5716
Joined: Wed Jan 08, 2003 5:24 am
Location: Tuxville, USA
Contact:

Post by Void Main » Mon Jan 08, 2007 2:18 pm

worker201 wrote:I was using GMT to mathematically surface a 1.9GB file, and it just plain gave up - gave me a "ran out of memory" error. At the time, I was running FC3 (a 2.4 kernel, I think) on a P4 with 1.5GB RAM. However, since it was a geographical grid, it was pretty easy to break the file into smaller chunks and process them separately, and then put them back together at the end before producing the image.
But that is a *completely* separate issue and has nothing to do with file size limits. What happened there is that your program ran out of memory (virtual/RAM, not disk). So it has nothing really to do with the size of the file, but with how much memory your program needed to run. This can be solved several different ways: one is to do what you did and break the input files up into smaller sections; two is to write the program so it doesn't hog so much memory; three is to get more RAM, if it's a case of the program needing more RAM/swap than you have available. It is also possible that system configuration limits are in place. There is a way to limit processes from using all memory so a user can't crash your system, though Fedora doesn't set these limits by default. I would first watch the process in "top" sorted by memory usage, and also watch your available memory to see whether the process consumes all of it, to make sure you aren't hitting other limits.
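If you want to rule out configured limits, a small sketch like the following prints the per-process limits that matter (just an illustration; `ulimit -a` in a shell shows the same numbers):

#include <stdio.h>
#include <sys/resource.h>

static void show(const char *name, int resource)
{
    struct rlimit rl;
    if (getrlimit(resource, &rl) != 0) {
        perror(name);
        return;
    }
    if (rl.rlim_cur == RLIM_INFINITY)
        printf("%-16s unlimited\n", name);
    else
        printf("%-16s %llu bytes\n", name, (unsigned long long)rl.rlim_cur);
}

int main(void)
{
    /* If these are set well below what the program needs, malloc() fails
     * and the program reports "out of memory" even though the machine
     * still has free RAM and swap. */
    show("address space", RLIMIT_AS);
    show("data segment", RLIMIT_DATA);
    show("stack", RLIMIT_STACK);
    return 0;
}

If those come back unlimited and top shows the process eating everything, then it really is a case of the program needing more memory than the machine has.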

worker201
guru
guru
Posts: 668
Joined: Sun Jun 13, 2004 6:38 pm
Location: Hawaii

Post by worker201 » Mon Jan 22, 2007 7:30 pm

Update:

I just ordered my new Linux machine: a ThinkPad T60. A consultant came in the other day and had one, and I was pretty impressed with its size and weight - it made the Dell Latitudes and Inspirons look and feel clunky. I just had to have one. It was way more than $900, though. It could have been even more - for some reason they are having an upgrade sale:

60GB 5400rpm HD ---> 60GB 7200rpm HD = free
CDRW/DVD-ROM ---> CDRW/DVDRW = free
6 cell battery ---> 9-cell battery = free

Further specs:
Intel Core 2 Duo processor T5500 (1.66GHz, 2MB L2, 667MHz FSB)
14.1" SXGA+ TFT LCD
Intel Graphics Media Accelerator 950
{an ATI card was available, but I wanted to go with open drivers}
1 GB PC2-5300 DDR2 SDRAM 667MHz
Intel PRO/Wireless 3945ABG
Bluetooth enabled

According to many reviews, I should have no problem installing Linux on this computer. I'm trying to decide if I want Slackware or if I want to try Gentoo.

User avatar
Void Main
Site Admin
Site Admin
Posts: 5716
Joined: Wed Jan 08, 2003 5:24 am
Location: Tuxville, USA
Contact:

Post by Void Main » Mon Jan 22, 2007 8:34 pm

That looks just like the one I am on now (mine is an R40). I think my HD is only 5400rpm though (a little slow) and it has a Pentium mobile at 1.4GHz. I have 1.25GB of RAM in it. You should be happy with it.

worker201
guru
guru
Posts: 668
Joined: Sun Jun 13, 2004 6:38 pm
Location: Hawaii

Post by worker201 » Tue Jan 23, 2007 12:07 pm

Oh, and shipping via UPS Ground was free, and upgrading to 2-day cost only $10. Guess I better start burning ISOs.

Actually, I have a question about that. I don't have a Linux machine available, so what's the best way to burn ISOs? I have a Windows machine with Nero and Roxio available, and there's also my Mac. I assume either will work just fine, but I wanted to make sure.

Just for kicks, I went to the Dell website and priced out the specs of the Thinkpad. Nearly every feature of the Thinkpad is a costly upgrade on an Inspiron, especially the faster FSB, faster RAM, and faster HD. The Dell would have been about $100 more than the Thinkpad, and would weigh nearly 2 pounds more! I think I got a pretty good deal. Just goes to show that good hardware is more expensive, no matter what brand of computer you get. Now, whether I need all that power, well, that's another story...

User avatar
Void Main
Site Admin
Site Admin
Posts: 5716
Joined: Wed Jan 08, 2003 5:24 am
Location: Tuxville, USA
Contact:

Post by Void Main » Tue Jan 23, 2007 12:44 pm

I have used Nero without any problem. If I remember right there is a pulldown menu and I think there is an option to "Write Disk Image" or something similar.

worker201
guru
guru
Posts: 668
Joined: Sun Jun 13, 2004 6:38 pm
Location: Hawaii

Post by worker201 » Thu Feb 01, 2007 11:28 am

I was able to burn the Slackware ISOs with some Roxio program we have at work. I also found a tool that let me check the md5 sums for each ISO before burning. Installation will begin either tonight or this weekend.

The first stage of Linux ownership began last night - removing the MS Certificate of Authenticity from the bottom of the computer. That sticker is not meant to be removed, and it does not come off easily. But I bought a bottle of GooGone and applied it with an old sock. After a few minutes, the sticker could be scraped off with a fingernail. A couple more dabs of GooGone, and no trace remained. Same with the "Designed for Windows XP / Windows Vista Capable" sticker on the top.

For anyone who is interested in replacing the top sticker on their laptops, I found this site that sells stickers for just about any decent OS. You can also find replacements for your Intel/AMD sticker as well.

User avatar
Void Main
Site Admin
Site Admin
Posts: 5716
Joined: Wed Jan 08, 2003 5:24 am
Location: Tuxville, USA
Contact:

Post by Void Main » Thu Feb 01, 2007 1:02 pm

Yeah, ripping those stupid stickers off is the first thing I do when I get to any machine.

User avatar
Calum
guru
guru
Posts: 1349
Joined: Fri Jan 10, 2003 11:32 am
Location: Bonny Scotland
Contact:

Post by Calum » Fri Feb 02, 2007 9:07 am

removing the M$ Certificate of Authenticity from the bottom of the computer.
did you pay microsoft for a copy of windows then? or does the sticker just come free with the laptop? this may be a trick question by the way.

Post Reply