eRacks is set to begin testing an Ubuntu Linux-based tablet. The 7-inch tablet would retail for around $200 and would include several key features missing from the Kindle Fire: a micro SD slot, a built-in camera, HDMI out and GPS support. If you're in the market for a tablet, you may want to keep an eye on the eRacks website. An upcoming software update would make it possible for the tablet to install and run Android apps.
Greg April 13th, 2012
Posted In: Uncategorized
We've had a long and arduous search for a portable netbook with a usable resolution (at least 1366×768) that will run Ubuntu smoothly, and we're pleased to report our findings! The MSI U230-040US netbook fulfills all our requirements without so much as a hiccup.
Most netbooks have a 1024 x 600 pixel display. This fails miserably with applications designed for higher resolutions, such as Eclipse. Working with Eclipse can be annoying enough, but on a lower-resolution display, important fields in certain windows are unusable and almost completely hidden.
Portability is important, and this system weighs in at 3.3 pounds. It's got a good solid feel to it, and the display bends back to an angle of about 135 degrees. The keys are next to each other, not spaced out like those on the Sony Vaio. The netbook's measurements are 11.71″(L) x 7.49″(D) x 0.55~1.22″(H).
This system passed all our tests and is available from eRacks as a complete dual-boot system, the eRacks/CUMULUS. We've got Ubuntu and Windows 7 on this one.
The built-in Webcam is 1.3MP and works with Cheese Webcam Booth, both photo and video. There is a 4-in-1 Card Reader (XD/SD/MMC/MS), and three USB2.0 connections. Bluetooth is working without any special configurations.
I've set the processor governor to OnDemand at 800MHz. The other available governors are Conservative, Performance, and Powersave, and a higher 1.6GHz clock speed is available.
All in all, this system gets top marks for usability and portability.
britta October 5th, 2010
Imagine having the technology to store your music, movies and pictures in a central location and to access them from anywhere in the house. Even better, imagine that you can do this with little cost for hardware and zero cost for software. Not only is it possible, it’s never been easier.
In order to be able to access your central media repository, you’ll need to connect your computers to a network. With wireless networking, you can cheaply connect your machines almost anywhere in the house without having to run any cable. For the minimal configuration of one server (your media repository) and one client (the system connected to your home theater that lets you actually use the media), one wireless router and two wireless adapters will do. Even better, if the wireless router sits near the server, you can directly connect the two via a cable, saving you the cost of one wireless adapter.
Nowadays, with storage so plentiful and CPUs so powerful, it really doesn't take much money to get good results, which is fortunate for those of us who have been negatively impacted by the downturn in our nation's economy. On the server side, a modest Intel Core 2 Duo with anywhere from two to four gigabytes of memory will do, and with one terabyte hard drives falling below $100, you should be able to save even more money. On the client side, with the new Intel Atom CPU, which is powerful, compact, quiet and highly energy efficient, you can build a thin client that sits snugly atop your entertainment center.
Media center software has grown increasingly popular, and the open source movement has kept up nicely with easy to install, easy to use applications.
For the operating system on both the client and server side, you have a plethora of Linux distributions to choose from, Ubuntu (http://www.ubuntu.com/) being our recommendation. Then, on the server side, you’d simply have to configure your system to share your files over the network. On the client side, applications for managing your media include XBMC (http://xbmc.org/), Elisa (http://elisa.fluendo.com/), Entertainer (http://www.entertainer-project.com/) and MythTV (http://www.mythtv.org/ — note that MythTV is a little more involved with regards to configuration and has components that must run on the server side.)
With hardware becoming cheaper and more powerful, and with the added bonus of using free software, a capable home entertainment system can be had for a minimal investment. And, of course, eRacks specializes in providing its customers with the resources they need, whether it be selling systems pre-configured to your specifications or offering consulting for more difficult projects. Contact eRacks today and find out what we can do for your home!
james March 24th, 2009
I will relate a recent battle I had with a laptop that uses the Prism54 wireless chipset and runs Fedora 10. For quite some time, I could not get it to connect to a WPA protected network. With an open network, it would connect just fine. I didn’t bother with WEP. I wanted to find out what was causing it to fail with WPA.
This is an older eRacks CENTRINO laptop (Pentium M 1.6GHz, 1GB RAM and an 80GB hard drive.) This post will also hopefully help anyone else who has a laptop with the Prism54 chipset (mine specifically is a PrismGT mini-PCI card.) Note that Prism54 is also available in PCI and USB wireless devices.
At first, I thought it might be a problem with the GNOME NetworkManager. So, I tried other methods of connecting, such as using the command line (for iwconfig/ifconfig), wicd, Wireless Assistant and WiFi Radar. Some of these seem to work better than others, but again, none would allow me to connect to my WPA protected network at home. Thus, it was time to dig deeper.
After some sifting through forum posts, blogs, and bugzilla, I finally came across something that might help. Apparently, the prism54 drivers have several different modules that are loaded. For some reason, there is a module (prism54), which might be an older version of the complete set, and then there are other separate ones: p54common, p54pci and p54usb. So in my case, it was loading prism54, p54common, and p54pci. According to what I have read, the prism54 module causes conflicts with the newer p54common and p54pci set. The suggestion for now is to add prism54 to the module blacklist, located in /etc/modprobe.d/blacklist. You add the following entry at the bottom:
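The entry in question is a standard modprobe blacklist directive:

```shell
# Appended to /etc/modprobe.d/blacklist -- keeps the older, conflicting
# prism54 module from loading, so p54common/p54pci can take over:
blacklist prism54
```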
Once I did this and restarted networking, I could connect to my WPA-protected network using the default GNOME NetworkManager. All is well again in WiFi land.
Hopefully, this little jaunt with prism54 will be able to help someone else.
Matt March 13th, 2009
Here are 10 really useful reasons to justify why you need a new Linux Netbook from eRacks.
Besides, a contributing member of this technological society is required to stay well-connected at all times. And in this economy, cost-effectiveness is imperative.
britta January 6th, 2009
There may be situations where you’d like to login to a remote machine via SSH and not have to enter a password to do it. Perhaps you have some sort of automated file transfer that makes use of SCP. Or, perhaps you frequently login to the same machine and get tired of having to enter a password each time. Whatever the reason may be, an attractive alternative to using passwords involves making use of cryptographic keys.
To give you a general idea of what’s involved, you’ll first generate a public/private key pair. Your public key is what you would copy to every machine you want to be able to log into. You can think of the public key as the lock on a door to a house. The reason why we call this a public key is that it’s safe to share it with the public, just as the lock on your door is safe to display from the outside. By contrast, your private key can be thought of as the key that fits into the lock. Unlike your public key, you should never copy it to machines that are either untrusted or to machines that you yourself don’t administer — this would be a bit like placing the key to your front door outside your house for strangers to use! Anybody who possesses your private key can access every machine to which you’ve made your public key accessible, so exercise extreme caution and guard your private key with your life.
SSH makes generating your keys very simple. From the command line, you’ll simply enter the following command:
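A typical invocation looks like this (the RSA key type shown here matches the id_rsa file names described below):

```shell
# Generate an RSA public/private key pair; accept the default file
# location (~/.ssh/id_rsa) and optionally leave the passphrase empty.
ssh-keygen -t rsa
```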
You'll then be asked a series of questions. Accept all the defaults. If you don't want to passphrase-protect your key pair (which would require you to enter the passphrase each time you use it), hit Enter when asked for the passphrase, without typing anything in. At the end of the process, you should discover two new files in ~/.ssh, id_rsa and id_rsa.pub, where ~ stands for your home directory. From here, you'll copy your public key (id_rsa.pub) to every machine you wish to log into and append its contents to a file called ~/.ssh/authorized_keys, where ~ stands for the home directory of the account you wish to log into.
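One way to do the copy-and-append step in a single command, sketched here with placeholder account and host names:

```shell
# Append the local public key to the remote account's authorized_keys.
# "user" and "remote-host" are placeholders for your own account/server.
cat ~/.ssh/id_rsa.pub | ssh user@remote-host \
    'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys'
```

Many systems also ship an ssh-copy-id helper that performs essentially the same steps for you.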
To test your newly generated key pair, try to connect to one or more of the remote machines you copied your public key to. You’ll find that you’re sent straight to a command prompt, without the need for a password.
Now, there are situations where using keys without passwords can potentially be hazardous, so some significant thought should be given to the circumstances in which your key pair will be used. For example, I will never copy my laptop's public key to my personal server at home, because if my laptop is ever stolen, the thief (if he knew how to use *NIX) would not only have access to all my local data, but would also have complete SSH access to my network at home, since he would have my laptop's private key. Thus, I choose to sacrifice convenience for security in that particular situation. As with all things in life, the relative amount of security versus convenience is a trade-off, so make sure you choose wisely.
james November 21st, 2008
This article is geared toward eRacks customers who have a desktop or laptop system, i.e. a personal workstation. It is not intended to serve as a guide for customers wishing to upgrade a server.
With the above in mind, for those who use Linux on such a machine, your choice of distributions that cater to this niche is growing nicely. You have the “Big Boys” such as Ubuntu, Fedora, Mandriva or OpenSUSE, as well as a host of more specialized distributions, the main focus of most being on user friendliness and “up-to-dateness.” What this usually leads to is a faster upgrade cycle than what you would typically find on a server oriented distro such as Debian (stable), RedHat Enterprise, SuSE Enterprise or CentOS.
I myself have been tracking RedHat (including Fedora) since version 5.0, doing a mix of upgrades and fresh installs. I have also kept up with Ubuntu since 6.06, and have had similar experiences with it. I have found that one way of making regular upgrades easier is to keep a separate /home partition. This way, you have a choice of an upgrade or a fresh install, without losing valuable data.
My experience, and that of many other salty seasoned Linux gurus, is that upgrading from a previous version tends to be a bit messier and usually takes longer to do than a fresh install. This can be true, especially if you use third party repositories, if you install software not maintained by your distro package manager (DEB or RPM) or if you do a lot of tweaking. Doing so may leave you looking at a broken system when the upgrade finishes. For this reason, it is usually more desirable to do a clean installation and install your third party applications afterward.
How then to keep from losing your data? Many system admins would suggest the multiple partition method, which has been used on servers a lot, yet not so much on the desktop. The multiple partition method can have advantages and disadvantages, but since hard drives are so big these days, many of the disadvantages are no longer prevalent.
While most modern desktop distros have a default partitioning scheme that gives you just a swap partition (usually about 2x the amount of RAM, or physical memory) and a large root partition for everything else, most server configurations have multiple partitions for directories like /usr or /var, which can have many advantages. For example, you might want /usr mounted read-only to prevent unauthorized system-wide software installs, /boot kept separate for a RAID array, or /var and /tmp kept separate to avoid corrupting the core system files. In this case, however, the partitioning must be very carefully planned according to the intended use of the server, what programs need to be installed, how many users will be logging in, etc.
Luckily, there is a happy medium that works well for desktops, and that is to use a swap partition with 2x the amount of RAM, a root partition for your operating system and a very large /home partition for all your data. When you do a fresh install, all you have to do is make sure you don’t format /home, and your data will be safe across installations. If you want to save any system-wide tweaks, you will, of course, also have to backup important configuration files and check them against their replacements, making changes where necessary.
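As a concrete illustration, on a machine with 2GB of RAM and a 160GB drive, that scheme might look something like the following (sizes and device names are examples only, not a recommendation for any particular system):

```shell
# Example layout only -- adjust sizes to your own RAM and drive:
#   /dev/sda1    20GB   /       (OS; formatted on each fresh install)
#   /dev/sda2     4GB   swap    (2x the 2GB of RAM)
#   /dev/sda3   136GB   /home   (personal data; NOT formatted on reinstall)
```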
In my case, I have a 120GB hard drive for Linux, which makes use of the following partition scheme:
14GB “other” (at times it has a Gentoo install, other times it has FreeBSD, depends on my mood…)
I have found through experience that this setup works well.
When I do an OS update, such as my recent one to Fedora 9, I usually backup important configuration files to /home, do a fresh install and finally install any third party programs I need.
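The configuration-file backup step can be as simple as copying the files you've tweaked into your home partition; which files you need depends entirely on your own system, so the list here is just an example:

```shell
# Copy tweaked system-wide config files into /home so they survive
# the fresh install (the file list is illustrative, not exhaustive):
mkdir -p ~/config-backup
cp /etc/fstab /etc/hosts ~/config-backup/
```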
In the past, when upgrading systems without doing a fresh install, things for me have tended to get rather wonky. However, I have recently tried upgrading Ubuntu, and I must say that the recently improved Upgrade Manager, a graphical front end to the apt-get dist-upgrade functionality, is a nice touch. It allows you to upgrade to the next version of Ubuntu, while still allowing you to run your system so you can go about your business as it downloads and installs all the packages. When it’s done, you simply reboot, and voila, new version! Upgrades on Fedora, by contrast, are still usually done by the tried and true method of booting the install disk and running the upgrade procedure. Fedora does have the capability to do upgrades using the yum package manager, but that functionality isn’t as mature as apt-get dist-upgrade, and thus is not for the faint of heart.
So now, what if you have an existing Linux installation utilizing only a single partition and you want to do a fresh install while keeping your data safe?
Of course, you could just back your data up to a large external hard drive, but not everyone has one at their disposal. In this case, what you could try is resizing your root partition, creating a new partition for /home and copying your personal data to it before starting the upgrade. Then, just run through the installation as usual. This is, of course, only possible if you have enough free space to resize. If not, you may still require an external drive, at least temporarily, to copy your data to before starting the installer.
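When an external drive is available, the temporary copy can be a simple tar archive (the /mnt/external mount point is a placeholder for wherever your drive actually mounts):

```shell
# Archive /home to external storage before reinstalling:
tar czf /mnt/external/home-backup.tar.gz -C / home

# After the fresh install, restore it:
tar xzf /mnt/external/home-backup.tar.gz -C /
```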
If you want to make use of multiple partitions on a new eRacks system purchase, just ask for it during your order. This way, your system will be ready when the next OS update rolls around!
Matt June 27th, 2008
Hello everyone out in the blogosphere (Look, my vocabulary improved!) Allow me to introduce myself. I am Max, the Op Manager here at eRacks. Now that that's out of the way, let's dig in!
I recently had the chance to lay my grubby mitts on the latest ASUS eeePC, and am here to give my initial impressions. Now mind you, I am a very busy man (Darn Starbucks being so far away!), so I only had a couple of hours to play with this little PC, and I must say, I am impressed.
Now, ASUS keeps a tight grip on the distribution of their eeePCs and makes sure they get their asking price, so shopping around won't net more than about a dollar in savings. I will chalk that up on the bad side of things. However, while it's a little on the high end of the price scale for its functionality, let me tell you something that makes up for that 100x: it works flawlessly, it's quick, and it gets a lot of looks (ladies, forget the new hairstyle. Pick up one of these bad boys and prepare for the geek onslaught!) The fact that I had no issues with it speaks volumes, because I always break something and have to have Tony, our Head Tech, come and save me.
It also comes with all the software you would need: open source applications, games, and media playing programs preinstalled and ready to go. You do have to sit through a quick registration screen at first to get to this, but hey, you have to do that with everything. When I started this mini-beast up, I was pleased to see that everything displayed quite nicely on the 7″ 800×480 screen. It even comes with a pretty nice Intel graphics chipset to boot. So, as far as visuals go, while you won't be seeing HD-style graphics, you will get a clear, precise picture that makes working on it pretty easy. Not bad, ASUS, not bad… But you could, ya know, boost the res up to maybe 1200? Maybe…please? C'mon…
Anyway, this is not by any means a replacement for a full-fledged laptop, but it is a nice miniPC that will come in handy for a quick write-up at a trade show, a place to store a few pictures, checking websites or email from the airport or any number of road-warrior-like activities. The other thing it's good for is KIDS! Kids love it; it comes in multiple colors, it plays games, it's small, it's neat, it makes noise through its 5.1 Realtek HD sound card, it plays music AND it's cool looking. The only problem I see with kids and this is that on the models we got, the keyboard is white (wash your hands, children, before touching it), so beware of dirty fingers! We actually had a customer call us and let us know that their children were hammering away on these things and that they stood the test of time (at that point, 1 week. But hey, it's a miniPC and a child. That's like platinum record status!) Another good feature is the card reader, which allows you to store plenty of files on SD cards. Neat!
The few bad things I have to say are as follows: it only has 2 hours of battery life (I know, I know; laptops and such do not have amazing battery lives, but 2 hours?! I've had layovers longer than that on flights from OC to SF); it has no DVD or CD player, which is a bummer, even though I do understand that it's a different category of PC — I still want to be able to throw in a DVD or listen to a CD I just bought (ok, that may be a lie; who really buys CDs anymore, anyone? I admit it. I do. MP3s be damned!); the graphics could be a bit better and the white keyboard is a parent's nightmare, although at least the keys are stuck close enough together that food can't hide in them. Overall, there weren't enough bad things to warrant a bad review, or to take away from the coolness factor.
In closing, I know this isn't as in-depth or as technical as some people would like. But hey, I'm Max, and Max is allowed to write what he wants (you love the 3rd person, I know it!) Overall, I give this 4/5 stars for a mini PC on coolness factor, and 3.5/5 on tech factor. Take my opinion with a grain of salt, though, for I am just an Op Manager doing my thing.
For the techies, here’s a rundown of the specs:
Visit www.eRacks.com for more info.
max May 2nd, 2008
Posted In: New products