Categories
Geek / Technical

Automating Backups for Fun and Profit

At the end of last month, I mentioned that I had bought a new hard drive for the purposes of backing up my data. I just now installed it in my second computer. Was getting the backup system in place important? Yes, absolutely! I don’t think that a hard drive might fail. I know it WILL fail.

While I could simply copy files from one drive to the other manually, that depends on me doing so regularly. I’m only human, and I can forget or get sick, resulting in potentially lost data. Computers are meant to do repetitive tasks really well, so why not automate the backup process? My presence won’t be necessary, so backups can take place even if I go on vacation for a few weeks or months. The backups can also be scheduled to run when I won’t be using the computer. Copying lots of data while trying to write code, check email, and listen to music at the same time can be annoyingly slow, but if it happens while I am sleeping, it won’t affect my productivity at all. And while the computer can faithfully run the same steps successfully each time, I might mess up if I run the steps manually and in the wrong order. With an automated system, my backups can be regularly recurring, convenient, and reliable. Much better than what I could do on my own week after week. I decided to make those three qualities the goals for this system.

I’ll go over my plan, but first I’ll provide some background information.

Some Background Information
LauraGB is my main Debian GNU/Linux machine. MariaGB is currently my Windows machine. The reason for the female names is that I am more a computer enthusiast than a car enthusiast. People name their cars after women, so I thought it was appropriate to name my computers similarly. For the record, my car’s name is Caroline, but she doesn’t get nearly the same care as Laura or Maria. My initials make up the suffix, GB. I guess I am not very creative, as my blog, my old QBasic game review site, and my future shareware company will all have GB. Names can change of course, but now I see I am on a tangent, so let’s get back to the backup plan.

LauraGB has two hard drives. She can run on the 40GB drive alone, as it has the entire filesystem on it, while the 120GB drive acts as a huge repository for data that I would like to keep: my reviews for Game Tunnel, game downloads, my music collection, funny videos I’ve found online, and so on. It’s a huge drive.

For the most part, the 120GB drive has simply held backup copies of data from the 40GB drive. I also put data from an old laptop on there. Then I started collecting files there that don’t have a second copy anywhere. If I lose that 120GB drive, I could recover some of the files, but for a LOT of the data there is no recovery. Losing that drive spells doom.

At this point, MariaGB can become much more useful than it is in its current role as my Games OS. With the new 160GB drive, I can now have at least two copies of any data I own.

The Backup System
I spent the past few weeks or months looking up information on automating backups. I wanted something simple and non-proprietary, and so I decided to go with standard tools like tar and cron. I found information on sites like About.com, IBM, and others. I’m also interested in automating builds for projects, and so I got a lot of ideas from the Creatures 3 project.

On Unix-based systems, there is a program called cron that lets you schedule programs to run automatically at specific times. Each user on the system can create his/her own crontab file, and cron will check it and run the appropriate commands at the specified times. For example, here is a portion of my crontab file on LauraGB:

# Backup /home/gberardi every Monday, 5:30 AM
30 5 * * 1 /home/gberardi/Scripts/BackupLaura.sh

The first line is a comment, signified by the # at the beginning, which means that cron will ignore it. The second line is what cron reads. The first part tells cron when to run, and the second part tells cron what to run. The first part, according to the man page for crontab, is divided as follows:

field          allowed values
-----          --------------
minute         0-59
hour           0-23
day of month   1-31
month          1-12
day of week    0-7 (0 and 7 are both Sunday)

So as you can see, I have told cron that my job will run at 5:30 AM on Mondays (30 5 * * 1). The asterisks tell cron to match any value for those fields, so the day of month and the month don’t matter.
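
As another (made-up) example, an entry like the following would run a script at 2:00 AM on the first day of every month, no matter which day of the week that falls on:

# Hypothetical entry, not in my actual crontab
0 2 1 * * /home/gberardi/Scripts/SomeMonthlyJob.sh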

The second part tells cron to run a Bash script I called BackupLaura.sh, which is where most of the work gets done.

Essentially, it gets the day of the month (1-31) and figures out which week of the month it is (1-5). There are five because it is possible to have five Mondays in a single month. Once it figures out which week it is, it goes to my 120GB drive and removes the contents of the appropriate backup directory. I called them week1, week2, and so on. It then copies all of the files from my home directory to the weekX directory using the rsync utility. I use rsync because a standard copy utility would change the file metadata, making the files look like they were all created and accessed at the moment of the backup. Rsync, run in archive mode, keeps the same permissions and timestamps the files had before the backup.
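
For the curious, a stripped-down sketch of the weekly part of the script would look something like this. The paths and exact variable names here are placeholders rather than what is in my actual script, but the logic is the same:

#!/bin/bash
# Sketch of the weekly backup logic (placeholder paths, not my real ones).

SOURCE_DIR=/home/gberardi
BACKUP_DIR_ROOT=/mnt/backup    # wherever the 120GB drive is mounted

# Turn the day of the month (1-31) into the week of the month (1-5).
DAY_OF_MONTH=$(date +%-d)
WEEK=$(( (DAY_OF_MONTH - 1) / 7 + 1 ))
BACKUP_DIR=$BACKUP_DIR_ROOT/week$WEEK

# Empty out this week's old backup, then copy the home directory over.
rm -rf "$BACKUP_DIR"
mkdir -p "$BACKUP_DIR"

# -a (archive) preserves permissions and timestamps; -v lists each file,
# and that listing ends up in the email cron sends me.
rsync -av "$SOURCE_DIR/" "$BACKUP_DIR/"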

So tomorrow at 5:30 AM, this script will run. As it will find that the date is 11 (April 11th), it knows that it is in the second week. So the directory week2 will be emptied, and then all files will be copied from my home directory to week2.

That’s all well and good, but you have probably noted that every week, the weekly backup erases the same week’s backup from the month before. When this script runs on May 9th, week2 will be erased, losing all of April’s 2nd week backup data! I’m ok with that.

Here’s why: every 1st Monday of the month, the script will make the week1 backup, but it will ALSO make a monthly backup. It takes the week1 directory and runs another utility on it called tar, which combines multiple files into one big archive file. The resulting file can also be compressed. Most people on Windows would use similar utilities to create .zip files, but tar fits the Unix programming philosophy: a robust tool that does one thing really well.

Usually tar is used with utilities like gzip or bzip2, but sometimes compression is avoided for stability reasons in corporate environments. Compressing might save you a lot of space, but on the off chance that something goes wrong, you can lose data. And if that data is important enough, the risk isn’t worth the space savings.

In my case, I decided to use bzip2 since it compresses much better than gzip or DEFLATE (what .zip files are typically compressed with). If I corrupt something, it isn’t the end of the world, since bzip2 comes with some recovery support; the bzip2recover tool can salvage the undamaged blocks of a damaged archive. So the script will take the week1 directory and compress it into a file with the date embedded in the name. The appropriate line in the script is:

tar -cvvjf $BACKUP_DIR_ROOT/monthly/LauraGBHomeBackup-$(date +%F).tar.bz2 $BACKUPDIR

The $BACKUPSOMETHING names are variables I defined earlier in the Bash script; they basically tell the script where to go in my filesystem. The file created is named “LauraGBHomeBackup-” + the date in YYYY-MM-DD format + “.tar.bz2”. The script runs date +%F and inserts the result into the filename. The file is placed in the directory called monthly. I can clean that directory out manually as necessary, and since the dates will all be unique, none of the monthly backups will get erased and overwritten the way the weekly backups are.
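
If I ever need to get at one of those monthly archives, standard tar invocations handle it. For example, with a made-up date in the filename:

# List the contents of a monthly archive without extracting anything
tar -tvjf LauraGBHomeBackup-2005-04-04.tar.bz2

# Extract the whole thing into the current directory
tar -xvjf LauraGBHomeBackup-2005-04-04.tar.bz2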

Conclusion
And so the backups are now automated. Every week, a copy of my home directory is synced to a second hard drive. Every month, a copy of the data is compressed into a single file named by date, making it easy for me to find older data. What’s more, cron will send me an email to let me know that the job ran and what the output was. If I forgot to mount the second hard drive for some reason and the script couldn’t copy files over, I’ll know about it Monday morning when I wake up.
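
That email comes from cron itself: by default, cron mails a job’s output to the owner of the crontab. If I ever want the report sent somewhere else, a MAILTO line at the top of the crontab takes care of it (the address below is just an example):

MAILTO=gberardi@example.com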

Once MariaGB is up and running, I can configure the two systems to work together on the monthly backups. After the .tar.bz2 file is created, I can copy it from LauraGB’s monthly directory over to a directory on MariaGB to be determined later. Of course, normally when I copy a file from one machine to another, I would need to manually enter a username and password. I can get around this by letting the two systems consider each other “trusted”, meaning that the connection between the two can be automated, which is consistent with one of the goals of this system. I’m very proud of what I have created, and I am also excited about what I can do in the future.
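
As for that trust, I haven’t settled on the exact mechanism yet. One likely candidate, assuming I end up running an SSH server on MariaGB (still an open question for a Windows box), is SSH public key authentication. Roughly, with the hostname and destination directory as placeholders:

# On LauraGB: create a key with no passphrase, used only by the backup job
ssh-keygen -t rsa -N "" -f ~/.ssh/backup_key

# One-time setup: append the public half to the authorized keys on MariaGB
cat ~/.ssh/backup_key.pub | ssh gberardi@mariagb 'cat >> ~/.ssh/authorized_keys'

# Then the script can copy the monthly archive without a password prompt
scp -i ~/.ssh/backup_key \
    $BACKUP_DIR_ROOT/monthly/LauraGBHomeBackup-$(date +%F).tar.bz2 \
    gberardi@mariagb:/backups/monthly/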

Currently LauraGB’s home directory has images, code, homework files, game downloads, and other files taking up 4GB of space, which clearly won’t fit on a CD-ROM uncompressed. To improve my backup system, I will need to purchase a DVD burner. I could then place a blank DVD in the drive and have the computer automatically burn the monthly backup when it occurs, giving me another copy of my data, one that I can bring anywhere. Ideally, remote backups would complete my system, but I think I will use those only for specific data, such as my programming projects and business data when I get some. Losing my music files and pictures in a house fire isn’t going to be as big a concern as the fire itself, but losing sales info, game code, and the customer database, essentially THE BUSINESS, would be something I probably won’t easily recover from.

For now, I am mitigating the disaster of a failing hard drive, which is my main concern. If you do not have your own backup system in place, either something homemade like my setup or through some commercial product, you’re walking on thin ice. Remember, hard drive failure isn’t just a potential event. It’s a certainty. It WILL fail. Prepare accordingly.

Categories
Games Geek / Technical

LAN Par-tay! or Stupid Processor…

This past weekend I went to a LAN party at my friend’s dorm. For those of you new to the term, you basically take your PC to this party and everyone connects to a Local Area Network (hence the LAN part) to play games for hours on end. This party was in the basement of the dormitory, and it started at 2PM on Saturday and ended at 12PM on Sunday. I think. It was a long night. B-)

While my Debian GNU/Linux system has a new video card and has a slightly faster processor than my Windows system, the fact was that we were playing games, and a lot of them aren’t available for GNU/Linux yet. So I took my Windows machine.

I had problems right away. My Windows machine didn’t have as complete a cooling system as my Debian system; I barely use it, so there was never a need, and I had played games on it before without trouble. Yet this weekend of all weekends, games would crash to the desktop. At first I thought it was possibly Windows 98. I’ve refused to install Windows XP for reasons I may go into another day, but at the urging of others, I installed Windows XP. Luckily I had brought the CD that I got for free for attending some Microsoft seminar on .NET. Games still crashed to the desktop. So rule out the OS.

I opened the system and found that the ATI Radeon 8500 video card was really, really hot. I think the ribbon cables were blocking airflow. Someone had a spare GeForce 2 MX, so I installed that. I still had crashes, so I opened the case to find that the video card was hot after only a few moments in the system. Was it overheating?

My friend let me use his fan to cool my system. It was funny seeing the case opened and a giant fan blowing into the system, but it kept it quite cool. Unfortunately, games would still crash to the desktop. So rule out overheating.

Someone else insisted I should lower the clock speed on my processor. My motherboard let me flip switches to lower the speed, but I didn’t want to do it at first. I paid money for an AMD Athlon XP 2100+, so why lower it? Well, it did the trick. Games stopped crashing, and the machine still ran fast enough to handle games like Aliens vs. Predator 2 and Unreal Tournament 2004. I’m still upset that I had to lower the speed, but apparently the processor was overheating otherwise. Perhaps the CPU fan isn’t working well anymore.

Me and my computer woes, eh? Two other people had some issues, but we all eventually got to play.

I haven’t been to a LAN party in a long time, and these days it is less likely since I work 40-hour weeks. It was a completely different situation when I just had school to worry about. It was a good time. I think everyone should attend a LAN party. Besides reminding you that playing games is important if you are going to make good games, it also reminds you that gaming is every bit as social an activity as any other. The next time someone tells you to “put down the controller, go outside, and get a life,” remind them that playing video games with friends is a bit healthier than getting overly drunk at bars and smoking. And arguably more fun.

Categories
Geek / Technical

My Computer is Back and Badder Than Ever!

Last time I mentioned how my Debian GNU/Linux system needed to be upgraded due to a malfunctioning video card.

I am happy to say that my system is running quite fine now. Here is a listing of some of the upgrades and changes:

  • nVidia GeForce2 GTS to nVidia GeForce FX 5500
  • Linux kernel 2.4.24 to Linux kernel 2.6.11
  • Deprecated OSS sound drivers to the new standard ALSA
  • A restrictive hard drive partition scheme to a less restrictive one

The video card runs amazingly well, although the true test will be verifying that Quake 3 Arena runs on the system. So far, Tux Racer and Frozen Bubble run fine. The 2.6 kernel is a good step up from the 2.4 kernel: the system responds faster, and it supports more hardware out of the box. The OSS sound drivers, for example, have been deprecated in 2.6, but I had always used them before since ALSA was iffy at best, requiring a separate download and a kernel recompile. Now ALSA is actually part of the kernel, and I am pleased that my system is so much more up to date. Debian is the distro that people make fun of for being less than cutting edge, but it makes up for it by being stable.

The hard drive partition scheme I had before made things difficult. I had a lot of space for my home directory, which is where I keep data files and the like, but less space for /usr, which I need for programs, and very little for /tmp, which made installing new programs difficult. The new scheme doesn’t wall off /usr and /tmp into their own cramped partitions, so they share a LOT more space, and the bulk of my data lives on the bigger hard drive anyway.

So my system is faster, more convenient, and more compatible with what’s new in the world of technology. Games that were out of reach before are now playable. Not bad for an emergency repair.

Categories
Geek / Technical

Stupid Video Card…

Long story short: I don’t have a working GNU/Linux system at the moment.

Long story long:

Two days ago, my Debian GNU/Linux system worked fine.

Yesterday, the video card was acting up.

The display was a little off, as if the monitor cable was getting interference. I switched cables with the Windows 98 machine I have, so I eliminated the monitor and the cable as the culprits. The GeForce2 GTS that I had won in my first ever eBay auction years ago had finally started to fail.

Just to make sure, I tried to reinstall the Nvidia drivers. For those of you who don’t know, driver installation on a Linux-based system is not a matter of downloading an executable and running it. Nvidia actually does provide something close to that, but the driver has to be made part of the kernel, usually as a kernel module, and that means compiling it against the kernel you are running.

Well, it complained that the compiler used to build the kernel wasn’t the same as the compiler currently on my system. It had been a while, and I’ve upgraded Debian a few times since, so that made sense. I decided to try recompiling my kernel, since I hadn’t done that in a long time and had been meaning to add some extra features, such as USB drive support, anyway.
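
On Debian, recompiling the kernel usually means the kernel-package tool rather than a bare make install. My exact steps varied, but it is roughly the following (the revision string is arbitrary, and the resulting package name depends on the kernel version and architecture):

cd /usr/src/linux
make menuconfig     # turn on the options I want (USB storage, my ethernet driver, ...)
make-kpkg clean
fakeroot make-kpkg --revision=custom.1 kernel_image
dpkg -i ../kernel-image-2.4.27_custom.1_i386.deb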

Recompiled, rebooted, and voila! I upgraded the kernel from version 2.4.24 to version 2.4.27, and I was surprised that it only took a few minutes to do so since I’ve had older/slower machines take a half hour or more. It turns out that it was a good thing it was so fast. I apparently forgot to add network support for my onboard ethernet. Whoops.

I attempted to recompile, but then I got strange errors about modules not existing, even though I followed the exact same steps. So I tried installing one of the older kernels, since I still had some of the packages I had created in the past. Still no network support? ARGH!!

I’ve been meaning to do a fresh install of Debian on this machine anyway. The hard drive partition scheme is more limiting than I had originally anticipated years ago (who thought it was a good idea to make /tmp only 50MB?!).

Since the video card was failing, I decided it was time to get a new one. So I went to Fry’s, which is about 20 minutes from my house. I bought a GeForce FX 5500 (w00t!), as well as a Western Digital 160GB drive and some quieter case fans. I bought the drive because I don’t want to end up like Lachlan Gemmell. I already have a 120GB drive that holds a lot of my files. Some of it is backups from my main drive, but a lot of it is made up of data that doesn’t have a copy anywhere. I would hate to lose the .ogg files I’ve ripped from CD or bought from Audio Lunchbox, the games I’ve reviewed for Game Tunnel, pictures of my friends and me, and of course my Subversion repositories for the projects I have been working on. I opted for a second huge hard drive since it would be easier and faster to make at least a second copy of my data. I can decide to get a DVD burner later.

And the fans? My system sounded like a jet engine starting up. I’ve been meaning to fix that problem as well.

Last night I installed the fans, and the machine is definitely a lot quieter. I decided to leave the video card and hard drive for today, since it was getting late and I needed to figure out how to set up the new drive in the first place.

So at the moment, I have a system that can’t connect to the Internet. It still can’t display anything, so it is rendered useless for the most part. All because I tried to fix it when the video card acted up.

The good thing is that I’ve made it quieter, and when I am through with it, it will be even more powerful than before. Doom 3 and Unreal Tournament 2004 are now more easily within my grasp. And I will finally get around to designing an automated backup system. Phoenix rising, indeed.

Categories
Game Development Games Geek / Technical

I’m Live not-at-the-GDC!!

The big news in game development these days has been surrounding the Game Developers Conference. A number of indie developers have covered the event, including David “RM” Michael, Saralah, Xemu, and Thomas Warfield. I’ve had to read about it and see pictures of people I’ve met in person or online, missing out on the fun.

I’ve read a few of the writeups that David Michael wrote for GameDev.net, and I intend to read the rest. I’ve also been reading Game Tunnel’s IGF coverage, including interviews and day-by-day news. It’s just like being there…only not.

Congratulations go out to those who made it to IGF finals! Some amazing games have been made by indie game developers, and they serve as an inspiration to the rest of us. This time next year, I hope to attend.

Categories
Games Geek / Technical General

1UP.com presents The Essential 50

The Essential 50 is 1UP.com’s compilation of the 50 most influential games in video game history. I feel bad because some of the games on the list, such as Battlezone and Prince of Persia, I’ve only read about. Others bring back good memories, such as Super Mario Bros. and Pac-Man. I am making a point to go back and play the games I own that I haven’t gotten a chance to play yet, such as Final Fantasy 7. I was a Nintendo fanboy when it came out, so I refused to touch it, but sometime last year I saw a PC version of it for under $20. I still haven’t played it.

It’s sad when you look back on the highlights of gaming and realize that you weren’t there for even half of it. Still, I have some good memories of some good gaming, and there is no reason for me to miss out on what’s to come. B-)

Categories
Geek / Technical Marketing/Business

Marketing for DLC Results

The DePaul Linux Community held its technical presentation last Thursday. Before I reveal how well the marketing did, I’ll note some general data regarding our previous presentations:

  • there are usually only a few people who show up (between two and six non-members is normal)
  • we usually only post fliers and tell people in our own classes

The difference in marketing this time around: I sent out an email to over 20 professors.

The difference in attendance this time around: we had over 15 non-members.

Only one of them could be directly linked to an email I sent to a professor. The rest said they found out about the event through the website (a marketing tool which I will need to make sure is working to its full potential) and through other friends. I don’t know how many of those friends knew about it from my letter to a professor. Still, this is very encouraging.

Also, we found a lot of people were very happy with the event, titled “Developing for the Modern Web”. Larry Garfield did a great job talking about the wonders of CSS. Many, myself included, were surprised at the number of things that you could do with it. My favorite comment on the feedback forms we had: “Excellent presentation. Thanks for sharing this ‘untaught’ knowledge!”

Categories
Geek / Technical Marketing/Business

Marketing for DLC

I’m a member of the DePaul Linux Community. We usually hold events each quarter, and this quarter is no different.

Last year we determined that we need to do more marketing. All we’ve ever done is post fliers up around campus, and the results have been decent. Unfortunately no one wanted to be the main person responsible for marketing. Since that time, I have learned quite a bit about running a business, and I know that marketing is definitely something I’ll need to get better at if I want to do well. I volunteered this quarter, partly to help the group and partly to practice my marketing skills.

I recently finished Jay Abraham’s Getting Everything You Can Out of All You’ve Got. At one point he describes direct mail. When I thought of direct mail before, I thought of junk mail, or snail-mail spam. It still is that for the most part, but I also see that it can be a valid marketing tool. It’s still unsolicited, but the marketing message arrives in its entirety, allowing the reader to get the full pitch. It’s supposed to have a higher response rate, and it makes sense that it would, especially compared to fliers.

Today I wrote an email and sent it to a number of faculty members in the CTI school of DePaul. It basically provided information about the different events, a link to the website, and a blurb about our mailing list. It suggested actions to take, explicitly asking them to post a section of the email on the announcement pages of their class websites and to make in-class announcements as well.

Unfortunately our first event is only a few days away, but hopefully the turnout will still be improved by this email alone. And it also sets the stage for the next few events.

Categories
Geek / Technical

Confusion about the GPL’s Purpose

While googling for open source game development to see what was out there, I came across Jeff Dillon’s blog. In his entry on June 18th, 2004, he argues that the GPL needs to be updated because software is no longer necessarily run on the local machine. The example he mentions is Google, which runs GNU/Linux but doesn’t have to provide any changes it makes to the end user. He thinks this goes against the purpose of the GPL:

The GPL Needs Updating

This means that any online services company can use all the Open Source work they want without ever giving anything back. This was not the original intent of the GPL. The original intent was to bring progress to software by sharing innovations. Google or any other online service company can now use all open source code without ever showing anyone what they have done.

Copyright law is frequently misunderstood, and the GPL is no exception. The purpose of the GPL is not to spread innovation. The purpose is to secure freedom for the user. Google is able to do what it does because the GPL says it can. The moment it releases its modifications to some other party, that’s when it will have to distribute the source (or promise to provide it upon request) as well. Does Google get to exercise the freedoms provided by the GPL? Yes. Does the end user of Google’s services lose any freedoms? No. They have the same freedom that Google has to use the same software. If Google doesn’t release the changes it made, the end user doesn’t get to use those changes and so hasn’t lost any freedom.

If the GPL needs to be updated, it isn’t because of a need to foster innovation.