
SUSE 9.3 - Restoring One Image Onto Identical PCs


Recommended Posts

Posted

Hi,

I've just installed SUSE 9.3 on one PC and configured things as I like them. So far a smooth process. Well, I have 5 more identical PCs I'd like to set up in an identical way.

On Windows I used DriveImage to restore an image, created on one PC, onto the other computers. All I had to do to make all the PCs work perfectly on the network was change each computer name so that it was unique.

Is a similar thing possible under SUSE 9.3? If so, how is it done?

I also have 4 other identical PCs (different from the 6 above). Under Windows I had to make a separate image for these (too many driver problems etc.). Could things be a bit easier on SUSE?

Posted

Hi,

You can use Partimage, a utility which saves partitions to image files in several formats.

It can be installed standalone on GNU/Linux systems and is also included in SystemRescueCD, a bootable CD-ROM containing many utilities.

I have it installed on all my Linux distros (the static build: partimage-0.6.4 (1.04 MB)), and even the compiled one on Sourcemage.

Many filesystems are supported, even Windows and PowerPC/iMac ones.

Posted

Hi,

Norton Ghost does recognize Linux now :o as of version 10!

Make a trip to Panthip :D

You'll have to boot from the CD and then follow the on-screen instructions :D

You know, the old HDD as master and the new one as slave, and hop, transfer!

francois

Posted

Thanks. I've downloaded, burned and checked out that SystemRescueCD ISO. It works to my full satisfaction. It even appears to have (at least) one advantage over my long-time companion DriveImage: if DriveImage can't detect the network automatically, you're left with no option to help it out, whereas Partimage seems to grant you the option to set the network parameters manually. (However, as a Linux newbie I have yet to figure out what the heck I'm supposed to do with that option.)
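For anyone else puzzling over that network option, it roughly works like this: you run the `partimaged` daemon on the machine that holds the images, bring up the client's network interface by hand, and point partimage at the server. A hedged sketch follows; the flag names are from memory of partimage 0.6.x and should be verified with `partimage --help` on your CD. The IP addresses and partition name are examples only, and the commands are just echoed here (not executed), since they would need a real server and a real partition:

```shell
# Hypothetical partimage client/server usage; verify flags against your
# partimage version before relying on them. Nothing below touches a disk.
server=192.168.0.10     # machine running the partimaged daemon (example)
partition=/dev/hda1     # partition to save/restore (example)

echo "On the server:  partimaged"
echo "On the client:  ifconfig eth0 192.168.0.11 netmask 255.255.255.0"
echo "Save:           partimage -s $server save $partition suse93-root"
echo "Restore:        partimage -s $server restore $partition suse93-root"
```

The manual `ifconfig` line is what the "set network parameters manually" option amounts to when DHCP isn't available.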

I suspect that Norton Ghost thing requires an expensive license to be legal? I'm sort of aiming at becoming the first legal Internet cafe on my block. Since I'm already handicapped by having to pay VAT (which I can't make my customers pay), I certainly wish to minimize the number of licenses I'll have to pay for.

Posted

People who are very experienced with Linux probably wouldn't bother with a "disk cloning" tool. It is easy to do the steps manually via any "rescue media":

1. partition the new host

2. format the filesystem(s)

3. download or transfer (USB disk etc) an archive image (tar file) from your "master"

4. unpack the tar file into empty filesystem(s)

5. initialize the bootloader in the bootsector, e.g. run appropriate GRUB or LILO setup command

6. change any system files you need to, e.g. /etc/fstab if you changed partitioning or have different drive layout
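The steps above can be sketched as shell commands. The device names (`/dev/hda*`) are examples only; steps 1, 2 and 5 are destructive, so they appear as comments, while the tar transfer (steps 3-4) runs against scratch directories here so the sketch is safe to try anywhere:

```shell
# Step 1: partition the new disk, e.g. by copying the master's layout:
#   sfdisk -d /dev/hda > layout.txt   # dump on the master
#   sfdisk /dev/hda < layout.txt      # apply on the new host
# Step 2: format and mount, e.g.:
#   mke2fs -j /dev/hda1 && mount /dev/hda1 /mnt

master=$(mktemp -d)    # stands in for the master's root filesystem
newroot=$(mktemp -d)   # stands in for the freshly formatted /mnt
archive=$(mktemp)
mkdir -p "$master/etc"
echo '/dev/hda1 / ext3 defaults 1 1' > "$master/etc/fstab"

# Step 3: pack the master. -p keeps permissions; --numeric-owner keeps
# UIDs stable even if the rescue system's /etc/passwd differs.
tar --numeric-owner -C "$master" -cpf "$archive" .
# Step 4: unpack into the empty filesystem on the new host.
tar --numeric-owner -C "$newroot" -xpf "$archive"
cat "$newroot/etc/fstab"

# Step 5: reinstall the bootloader, e.g.:
#   grub-install --root-directory=/mnt /dev/hda
# Step 6: adjust per-host files ($newroot/etc/fstab, hostname, ...)
# before rebooting into the clone.
```

Because the transfer is just files, you can change the partition sizes between source and destination freely, as noted below.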

There are really no special file positions on disk with Linux, save for the bootloader blocks, which are taken care of in step 5. This has the added benefit that it is easy to change the partitioning between the source and destination machines, since you are just copying files around.

Note, your solution probably works for client-only workstations where you do not care too much about security, but is kind of "wrong". There are lots of per-host files such as server host keys which should not normally be shared between multiple computers. I do not know about SUSE, but in the RedHat/Fedora world there is "kickstart" to get a description of an install and automatically install the same OS onto multiple computers so each one gets its own unique private files, just as if it were installed manually. This has the benefit of being able to handle machines that require different device drivers (kernel modules) to be enabled at boot time.
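To illustrate the per-host point: after restoring a clone you would wipe the SSH host keys copied from the master and generate fresh ones on each machine, so the machines don't all present identical identities. A hedged sketch (it writes into a scratch directory here instead of the real /etc/ssh, so it is safe to run as-is):

```shell
# On a real clone the directory would be /etc/ssh and you would first
# remove the keys inherited from the master:
#   rm -f /etc/ssh/ssh_host_*key*
sshdir=$(mktemp -d)   # scratch stand-in for /etc/ssh

# Generate a fresh RSA host key pair with an empty passphrase,
# which is what sshd expects for host keys.
ssh-keygen -q -t rsa -N '' -f "$sshdir/ssh_host_rsa_key"

ls "$sshdir"
```

Kickstart-style installs avoid this problem entirely, since each machine generates its own keys at install time.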

This is what people who manage large sets of PCs would do. The trick is to figure out how to operate with standard package configurations so there does not need to be much (if any) custom setup after installation. Often, you would use network-based authentication and user home directories so most configuration is centralized on the LAN. Kickstart allows scripting of some customizations, but I think usually people who use it actually generate extra custom packages to get things the way they like for a big school or corporate LAN. The ideal is to have kickstart files represent everything needed to get the blank machine into its operating state.

In the cluster space, people do something else that might have interested you if you were still planning your cafe. They run "diskless" machines which basically boot off the network every time, not holding any files locally. They may or may not run a local disk for "swap" space. This means you do not have to worry about some sneaky or clumsy user corrupting the local disk. Reboot and you get a clean image. There are different techniques that have different amounts of shared vs. unique data per host, stored in either case on a file-server in your LAN. On one extreme, you have systems like ROCKS which basically reinstall Linux automatically much like kickstart whenever they need to be rebooted. On the other extreme, you have systems where the Linux image is installed manually but resides on a network file server instead of on local disk. In between, you have systems where a small ramdisk image is loaded into each client, and they share a read-only installed image on the network file server. To admin this, you need to have familiarity with which parts of the system image are read-only, which are written by programs during runtime (and need separate copies for each host) and which are written during software configuration time. There is a clustering tool called SLURM which helps run things in this mode. (Both ROCKS and SLURM are open-source/free software.)

There are similar methods that can use a "live CD" to reduce the amount of data that goes over the network... just burn a standard CD copy for each machine and let them boot from that. Of course, the CD drive shouldn't be accessible to the user or they could remove it and boot something else on your LAN.

For a case like yours, a very stateless model could be used where the client machine holds session state in its ramdisk and discards it on reboot. Cause a reboot after every user session. That way every machine gets a clean copy of your "master client" image on startup, and there is less chance of a user leaving personal data or sneaky software around for the next user to encounter.

Posted

rishi asked for a similar tool:

On Windows I used DriveImage to restore an image, created on one PC, onto the other computers. All I had to do to make all the PCs work perfectly on the network was change each computer name so that it was unique.

Is a similar thing possible under SUSE 9.3? If so, how is it done?

So I pointed him to a similar tool. :o

Posted

Just like with Thailand, I think it helps people function in the Linux world if they know a little more about the history and the ways of thinking practiced by the natives... people don't use specialized disk cloning tools because the regular system administration tools can do it all and with lots of options. Similarly, I was trying to inform him of possibly better alternatives to cloning for his cafe by pointing out the popular and free clustering software.

And, I've always felt that what people ask for is a very poor indicator of what they need, and a useless constraint on what they'll get from me. When someone asks for a hammer that won't hurt their thumb as much, I don't give them a nerf hammer. :o
