A friend has an aging Linux server on which he runs a web app that he
custom built. The web app is written in Perl and has grown organically,
with libraries and other dependencies added over the years. The
server is getting a bit long in the tooth, and my concern is that a
proper backup is needed.
My first instinct is that, given he doesn't have an exact handle on the
dependencies, he needs to ensure he has a backed-up copy of
the entire system "as is". For backup purposes, as well as for
having something to properly develop on, a virtual machine seems the way
to go. A VM gives some breathing space to do proper dependency analysis
in preparation for a clean OS install, and should the current hardware
die, provided file and database changes have been backed up in the
interim, it wouldn't be too hard to switch over to the VM.
The server uses Linux software RAID1
# mdadm --detail /dev/md1
        Version : 00.90.03
  Creation Time : Sun Mar 26 17:57:19 2006
     Raid Level : raid1
     Array Size : 76196224 (72.67 GiB 78.02 GB)
  Used Dev Size : 76196224 (72.67 GiB 78.02 GB)
   Raid Devices : 2
  Total Devices : 2
Preferred Minor : 1
    Persistence : Superblock is persistent
    Update Time : Mon Jan 28 17:35:20 2013
          State : active
 Active Devices : 2
Working Devices : 2
 Failed Devices : 0
  Spare Devices : 0
How would you image the system, to P2V it?
In theory I think one could:
* Image a single drive to a virtual disk, boot into a degraded state,
then recover or drop the RAID.
* Image both drives to virtual disks and boot into a normal state.
* Build a VM using the same underlying OS, then, using a live environment
on both systems, rsync the files from source over destination,
overwriting the destination entirely, and fix up fstab and the RAID config.
I'm not entirely sure which tools to use for this, though, or what the
pitfalls are. Also, my friend wants to minimise downtime, as the web app
powers a relative's web store.
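For the first option, here is a minimal sketch of the imaging step, assuming the disks are /dev/sda and /dev/sdb and that KVM/QEMU is the target hypervisor (both assumptions; substitute whatever the real system and hypervisor use):

```shell
# Boot the old server from a live CD/USB so the filesystems are quiescent,
# then pull a raw image of one RAID1 member over the network:
ssh root@oldserver 'dd if=/dev/sda bs=1M conv=sync,noerror' > oldserver-sda.raw

# Convert the raw image to qcow2 for KVM/QEMU (skip this for a raw-disk VM):
qemu-img convert -f raw -O qcow2 oldserver-sda.raw oldserver-sda.qcow2

# Attach the image to a new VM and boot; the md array comes up degraded,
# after which you can drop the RAID inside the VM, or add a second virtual
# disk and let mdadm rebuild onto it.
```

For minimising downtime, the rsync option is worth considering instead: do a first rsync while the server is live, then a short outage for a final rsync pass plus a database dump.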
Quoting Rick Moen <rick(a)linuxmafia.com>
> Quoting James Harper (james.harper(a)bendigoit.com.au):
>> When I do 'apt-get update', 'dpkg -l' doesn't know about the new
>> packages and so isn't useful for searching.
>
> Wrong tool. I think you're looking for 'apt-cache'.
Just for the record...
It's likely that 'dpkg -l' uses the file /var/lib/dpkg/available as its
source; this file is updated by dselect. One can update it by running
dselect and selecting the menu item "1. [U]pdate".
I use apt-get, dselect and dpkg; each has its own strengths and weaknesses.
I find dselect particularly good at telling you what is being dragged in to
satisfy dependencies, in many cases (for me anyway) showing that it's pulling
in something you ____really____ do not want.
I do not want a system with a lot of stuff on it that I will never use, and
apt is very good at installing masses of stuff, although a suitable setting
in /etc/apt/apt.conf helps a lot.
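Which apt.conf setting isn't specified; a common candidate that stops apt dragging in piles of extra packages is disabling automatic Recommends (an assumption on my part, not necessarily the poster's setting):

```
// /etc/apt/apt.conf -- stop apt installing Recommends: packages by default
APT::Install-Recommends "false";
```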
When I do 'apt-get update', 'dpkg -l' doesn't know about the new packages and so isn't useful for searching. How do I update dpkg's database these days? Or how can I do a similar search in apt's database? I know I used to be able to do it with dselect but I much prefer the command line when searching.
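Concretely, the search might look like this (rsync is just an example package name; apt-cache reads the package lists that 'apt-get update' refreshes, so it sees new packages that 'dpkg -l' does not):

```shell
apt-get update                        # refresh the package lists (as root)
apt-cache search rsync                # search names and descriptions
apt-cache search --names-only rsync   # restrict matches to package names
apt-cache show rsync                  # full record for one package
dpkg -l 'rsync*'                      # dpkg only knows about installed/seen packages
```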
I have two Linux laptops, both with the latest updates to the most recent
LTS release:
1). Samsung netbook. This uses the Broadcom wl wifi driver and works fine
all the time everywhere.
2). Lenovo W520. This uses the iwlwifi driver and works on every network I
have tried it on except at La Trobe University. I have tried both
LTUwireless2 and Eduroam; it is impossible to keep a connection to either
of them (at La Trobe) for longer than about 2 minutes. Outside La Trobe,
Eduroam works just fine.
The configuration settings are the same on both boxen.
Up until a few months ago the W520 worked fine on the La Trobe network. IT
support claim that no Linux box can connect to their wifi or ever could. As
I was using it with the Samsung half an hour ago I know this is false.
Obviously something has changed, but where do I start looking? Is it my
driver, my hardware, or has something changed on the network, and if so, what?
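Some starting points for narrowing it down, assuming the usual wireless tools are installed (the interface name wlan0 is a placeholder, and the 11n_disable idea is a guess to test, not a known fix):

```shell
# Watch the kernel/driver messages while the connection drops:
dmesg -w | grep -i iwlwifi        # or: journalctl -kf

# Check link state, then log association/deauth events as they happen:
iw dev wlan0 link
iw event -t

# Try disabling 802.11n in iwlwifi, a common workaround for drop-outs
# against some access points:
echo 'options iwlwifi 11n_disable=1' | sudo tee /etc/modprobe.d/iwlwifi.conf
sudo modprobe -r iwlwifi && sudo modprobe iwlwifi
```

If the drops correlate with deauthentication reason codes in the `iw event` log, that points at the AP side rather than your hardware.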
Ideas most welcome
A friend of mine is running a Joomla-based website.
The first page renders fine on Windows and Apple devices,
but on Linux it shows a full page of spam gibberish <a href>'s to the most
wanted items (if I believe the spam) before the header menu starts, so I
have to scroll down to see any useful content.
I tried two machines running Ubuntu 12.04, with both Chrome and
Firefox, and the rubbish was always there.
Stranger still, it does not even appear in the page source if I look at it
from a Windows or Apple machine.
I have not looked at the server yet (and my friend isn't an IT expert), but
I find it so weird that I don't have the slightest idea what the cause
might be.
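Spam that only some clients see usually means a compromised site serving different content depending on the User-Agent (or Referer) header, a pattern common in hacked Joomla installs. A quick way to check, with example.com standing in for the real site:

```shell
# Fetch the front page with two different User-Agent strings and compare:
curl -s -A 'Mozilla/5.0 (X11; Linux x86_64)' http://example.com/ > ua-linux.html
curl -s -A 'Mozilla/5.0 (Windows NT 6.1; WOW64)' http://example.com/ > ua-windows.html
diff ua-linux.html ua-windows.html | head -20

# These hacks often target crawlers too, so a search-engine UA is worth a look:
curl -s -A 'Googlebot/2.1 (+http://www.google.com/bot.html)' http://example.com/ | head -40
```

If the output differs per UA, the injection is server-side and the Joomla install (or its templates/extensions) needs auditing.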
Thanks for answers
On 24/01/2013 12:16 PM, "Russell Coker" <russell(a)coker.com.au> wrote:
> Unison sounds interesting, it's something to consider for future uses,
> John Mann.
While unison is good (I use it all the time), it is no longer actively
developed. See http://www.cis.upenn.edu/~bcpierce/unison/status.html
I wonder whether I can prevent outgoing "bulk e-mail".
A disgruntled departing employee was threatening to send mail to all
customers. (It did not happen, but management was very unhappy and asked
what I could do against it from the technical side.)
I have a Postfix mail relay (upstream goes to the provider), but I cannot
see a way of telling it to "reject all mail with more than 50 addresses in
the header", as an example.
We have a special "list server" to facilitate legitimate sending to many
recipients at a time. It bypasses this relay, so I don't have to worry
about it.
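Postfix can cap recipients per message at SMTP time with smtpd_recipient_limit. Note it counts envelope recipients rather than addresses in the To:/Cc: headers, but a mass mailing has to deliver to the envelope either way. A sketch for main.cf, using the 50-address threshold from the question:

```
# /etc/postfix/main.cf
# Reject messages with more than 50 envelope recipients (default is 1000):
smtpd_recipient_limit = 50

# Optionally rate-limit how many messages one client may send per hour:
smtpd_client_message_rate_limit = 100
anvil_rate_time_unit = 3600s
```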
Thanks for answers
On 24/01/2013 12:16 PM, Russell Coker wrote:
> The -n and -i options suggested by Chris and James are useful. They do what I
> originally wanted to do. But it turns out that the batch support is even
> better. The rsync people are really smart.
Whilst I did learn a thing or two from this thread, thank you very much,
I am astounded that you seem to have no trouble reading Wikipedia
articles to attack people, but you can't seem to read a man page.
It's not the first time something so basic about a Linux tool or
operation has escaped your vast experience, and that really surprises me.
The other most significant oversight from you was the checkarray
processing of software RAID. Just the same, it's interesting; I'm sure you
are a very knowledgeable Linux user, but sometimes I just wonder.
I'm looking for an Optus-approved cable modem whose Ethernet port
I can plug into my WNDR3800 router such that the WNDR3800 is directly
exposed to the public IP address, with no private addressing or NAT in
the way. Which one, if any, should I choose?
Where I live, I'm stuck with Optus cable as my broadband internet (short
story is we're too far from the exchange to get ADSL, despite being in
the middle of suburban Melbourne), and we're looking at upgrading our
plan to something faster. We currently get something like 8-10 Mbit/sec
down and 128-256 Kbit/sec up, and we can allegedly upgrade to 20
Mbit/sec down and 2 Mbit/sec up.
Trouble is, to do this Optus tell us we need to upgrade our modem.
Currently we have a Motorola SB5101, which takes the coax cable in, and
spits out a single Ethernet interface that goes into the WAN port on my
OpenWRT Netgear WNDR3800 router, and just gives me a public IP address
on the router.
Below is the approved equipment list from Optus:
Motorola SB4100, SB4101, SB4200, SB5100, SB5101
Netgear CG814WG, CVG824G, CG3000
CISCO EMTA2203, DPQ3212, DPQ3925
I note that my modem is already on there, but allegedly (second-hand
information) the woman at Optus suggested a Netgear or a CISCO.
So having reviewed a couple of the Netgear modems (on the basis that
they're almost certainly cheaper than the CISCOs), I've found that
they've got a lot of router smarts in them, and have 4 LAN ports, WiFi,
NAT, and all manner of other things that you want in a router, but that
I don't want in a modem running proprietary software. I would like to be
able to plug a modem's Ethernet port directly into my current router,
and have *my* router do all the hard work of NATting and private
addressing etc. Does anybody know if any of the modems listed above can
do this, while supporting whatever new stuff is required to obtain the
20 Mbit/sec speeds being offered (I don't know enough about cable
internet to understand what exactly needs upgrading)?
Is there a way of getting rsync to just list what it would do without doing
anything? Failing that is there any other way to get a list of changes
between two trees of files where one of them is remote?
I've got an archive of 30G+ of video files that needs to be synchronised
occasionally with a system that has limited quota. I want to get a list of
new/changed files that I can pass as parameters to tar and then take a tar file
on a USB stick with the new files.
I know I could run "find ." on both sides and then compare the output files, but
I'd prefer something less hacky if possible.
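A sketch of the dry-run approach (paths are placeholders): -n (--dry-run) makes rsync report what it would transfer without touching anything, and --out-format='%n' prints just the file names, which tar can then consume.

```shell
# List what rsync *would* copy, one name per line:
rsync -a -n --out-format='%n' /archive/videos/ remote:videos/ > changed.txt

# Build a tar of just those files for the USB stick:
tar -C /archive/videos -czf /media/usb/new-files.tar.gz -T changed.txt
```

Running `rsync -ain` instead gives an itemized listing where the -i flags explain why each file would transfer; rsync's --write-batch / --read-batch options can even produce a batch file to carry over and replay on the far side.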
My Main Blog http://etbe.coker.com.au/
My Documents Blog http://doc.coker.com.au/