I spent 20+ years in Fortune 100 corporate IT and held many management titles during that time (and some unofficial titles given to me by coworkers who did not like my strict ideas). IT is not about being totally smart and knowing everything; rather, it is about learning, evolving, establishing best practices, and learning from the mistakes that will happen. It is about being serious about the job and about thinking beyond 5 p.m. each day.
I was also in the U.S.A.F. in Communication Security. ’nuff said. Any mistake can be a life-or-death situation for someone – often yourself, as anyone who has ever looked down the barrel of an M16 can attest (… what was that password of the day?)
Today, I thought I would share my personal method of backup. I will refer to the server hardware as “seconds” PCs: purchased refurbished, but with new hard drives installed and a very extended RAM test. Each server needs only 2 GB of RAM for Linux or Windows XP, but I like to buy used hardware with 4 GB, as the price is generally identical; sometimes you just have to look a bit harder. The two servers do not have to be identical hardware or even run the same OS: here we are concerned with “data” on a file system, and I have used both Windows and Linux. I use a KVM switch so there is only one keyboard/LCD/mouse physically for up to four servers. One can also look into free software such as VNC for a headless installation.
(1) The primary server is always left on, 7x24x365, and is taken down once or twice a year to vacuum the airflow areas and blow dust off the CPU cooler. The (2) secondary server is always left off, except when I feel the need to make a full backup of the primary: for me, this is after something B-I-G, such as installing a new version of Arduino or GCC, or even buying a new piece of hardware such as the ESP8266. Then it is a Saturday or Sunday afternoon process of bringing up the (2) secondary box with the WAN connection to Comcast unplugged. Of course, the LAN connection to the WiFi routers is available, as is the 10/100/1000 24-port network switch. Do not buy a cheap switch – the LAN switch is no place to cut corners.
The primary server (1) has two hard disks but does not need an IDE CD reader or writer. The primary HD is sized based upon your backup needs: I’ve grown over the years from 320 GB to 500 GB to 1 TB. The second drive should be a similar size to the first, since you should use cloning software to make an identical copy of the primary partition(s), depending upon personal choice. In my case, I have a C: partition for boot/OS and a D: partition for data; the second drive is identical. However, the second partition on the second drive is used by the scheduler to back up the D: data partition on the first drive. This is automatic and happens daily at around 2 a.m. when the cron job kicks off.
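On a Linux box, the nightly drive-to-drive copy could be as simple as a crontab entry like this – a hedged sketch, assuming the data partition is mounted at /data and the second drive's backup partition at /backup (substitute your own mount points):

```shell
# Hypothetical crontab entry: every day at 2 a.m., mirror the data
# partition (first drive) onto the backup partition (second drive).
# --delete keeps the copy identical to the source.
0 2 * * * rsync -a --delete /data/ /backup/
```

On Windows, the equivalent is a Task Scheduler job running your backup tool of choice at the same hour.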
Recap: one box, two drives of the same size, starting out as identical copies with no user data. Why? If the first drive dies, the second can be reconfigured to BOOT. Remember that the OS partition will likely be patched by Microsoft/Linux over time, so be very careful with any update that changes the OS file-system drivers. For now, avoid Win10, as there simply is not enough information for me to think Micro$oft will not be messing with us.
The second box can be less capable than the first; simply make sure it has a data partition that can accommodate all your data.
Important concept: the OS “share name” should be the same for both data partition(s); example: Arduino_BU. The server names should be different. Yes, this means that any workstation scripts that map the server shares for backup will need editing, since the name is different; however, if you assign the server and the backup server TCP/IP addresses outside the DHCP address range of your router, then you can always write your backup scripts against the IP address rather than the server’s DNS or NetBIOS name. Example: \\192.168.100.51\
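As a sketch of what an IP-based backup script might look like from a Linux workstation – the mount point, credentials file, and source directory below are illustrative assumptions, not part of the original setup:

```shell
#!/bin/sh
# Back up a workstation directory to the server share, addressed by
# static IP so the script survives any NetBIOS/DNS name changes.
MNT=/mnt/arduino_bu
mkdir -p "$MNT"
mount -t cifs //192.168.100.51/Arduino_BU "$MNT" -o credentials=/root/.smbcred
rsync -a "$HOME/Arduino/" "$MNT/"
umount "$MNT"
```

Because the secondary server uses the same share name, pointing this script at the other box means changing only the IP address.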
So, two different named boxes, two IP static IP addresses, both with a data partition the same size and the same share name.
Server #1 has 3 partitions and two drives:
Drive 1: OS + Data partition
Drive 2: data partition for the daily automated backup (Tower-of-Hanoi rotation using free MS backup software).
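The Tower-of-Hanoi rotation works by reusing backup sets at power-of-two intervals: set A every 2nd run, B every 4th, C every 8th, and so on, so older sets retain progressively older history. A minimal sketch of the scheduling rule (the function name and set lettering are my illustration, not part of the MS tool):

```shell
#!/bin/sh
# Tower-of-Hanoi rotation: the set used on day N is determined by how
# many times 2 divides N, giving the sequence A B A C A B A D ...
hanoi_set() {
    n=$1
    i=1                       # 1-based index into the set list
    while [ $((n % 2)) -eq 0 ]; do
        n=$((n / 2))
        i=$((i + 1))
    done
    echo "ABCDEFGH" | cut -c "$i"
}
```

For days 1 through 8 this yields A B A C A B A D: set A holds yesterday's data, while D is touched only every eighth run.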
Server #2 has 2 partitions, one for the OS and one for the data. Backup is done from server 1 to server 2 while the WAN is disconnected, since server 2 is never/rarely given OS updates. You can deviate from my suggestion here if you want to assume a little additional risk. What I usually do is make a server 2 backup when something serious happens in my network domain.
Examples:
– Roger releases a new STM32 core and I decide to upgrade
– I decide to take on a new piece of hardware and must augment/update Arduino
– Tax time and I have finished the input process
The nice concept here is that you can selectively update server #2 by dragging and dropping just the directories of interest. You can also create batch files to do the nasty for you.
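A hedged sketch of such a batch job, assuming the server #2 share is already mounted at /mnt/server2_bu and using made-up directory names:

```shell
#!/bin/sh
# Selectively push only the directories of interest from server 1's
# data partition to server 2's share (all names here are illustrative).
DEST=/mnt/server2_bu
for dir in Arduino gcc taxes; do
    rsync -a --delete "/data/$dir/" "$DEST/$dir/"
done
```

Editing the list of directories is the script equivalent of choosing what to drag and drop.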
So, how does our data age:
– Original, on computing device
– Backup, on server #1 by operator/batch directory/file transfer (manual or automated). Remember, the Server #1 resource is 7×24.
– Automated backup on server #1 (nightly) using free backup software. Backup is to a dedicated partition on physically separate (bootable) drive
– Snapshot backup on server #2 of critical data (or all as warranted by end-user needs)
There should be absolutely no need to use a browser on server #1 or server #2, with the single exception that server #1 should be kept updated with pertinent OS security patches. Always make a critical backup from server #1 to server #2 before trusting Microsoft.
I have used slight variations of the above since 1991, when one server was Novell 4 and the other was NT 3.51 and the “automated” backup was done to DAT tape. To my knowledge, I have never lost a file in 25 years, while I know that cannot be said of my old company, which often played fast and loose when IT Security was not paying close attention.
Side benefits: server #1 can be used to host a print queue for a local printer, making it network available. This is great for recycling a laser or color printer that is not network aware. You can find these things for $25 with some toner left in the cartridges. I would strongly advise against going back before the days of the Energy Star devices, which turn off the fuser and non-network components until the printer is needed.
For folks who are space challenged and just cannot set up a separate location for a server, an external USB hard drive can serve as “backup #1” and a separate USB drive can be “backup #2” … or, if USB wires drive you crazy, you may be able to get away with 64 GB or 128 GB USB sticks.
The point is, technology implementation is secondary to having a quality backup. Most of us will appreciate our anal behavior when we accidentally delete that primary copy. And remember, your primary backup and secondary backup should be physically separate; even in separate buildings if the data is very valuable.
Good backups!
Ray
I have used just a two-drive Synology DS212 NAS on my LAN. For years.
Drive 1 has the shares where I keep all files. I never keep my files solely on the PCs. Centered Systems’ Second Copy software automates ensuring there’s never just one copy on the PC.
Drive 2 on the NAS (not a RAID setup, by choice) gets an incremental backup of drive 1 every day, automated. It could be more frequent if I chose, but Second Copy does every 2 hours for VIP folders.
Drive 2 ALSO has Synology’s “Time Backup”. It keeps the last 6 months of all versions of files in important folder shares that I designate. More than once, I have used the GUI to browse prior versions and get back some file I hosed up.
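Time Backup itself is Synology's proprietary feature, but the version-keeping idea can be sketched with rsync hard-link snapshots: each dated snapshot hard-links files unchanged since the previous one, so only modified files consume space. (Paths below are assumptions for illustration, not what the DS212 actually runs.)

```shell
#!/bin/sh
# Create a dated snapshot of $SRC under $DEST, hard-linking unchanged
# files against the most recent previous snapshot.
SRC=/volume1/shares
DEST=/volume2/timebackup
TODAY=$(date +%Y-%m-%d)
PREV=$(ls "$DEST" 2>/dev/null | tail -n 1)
mkdir -p "$DEST"
if [ -n "$PREV" ]; then
    rsync -a --link-dest="$DEST/$PREV" "$SRC/" "$DEST/$TODAY/"
else
    rsync -a "$SRC/" "$DEST/$TODAY/"   # first run: plain full copy
fi
```

Pruning versions older than six months is then just deleting the old dated directories.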
The NAS copies VVIP files to the 32GB SD card in the NAS.
And once a week, the full NAS gets copied incrementally to a USB3 2TB drive which is removed and kept out of sight/safe from fire.
On the PCs, I do drive image backups (incremental) automatically every day – target is the NAS. Using Acronis.
And my main desktop – I clone the boot drive (a 500GB SSD) to a 1TB hard disk every couple of weeks. Takes 20 minutes. If I get a virus, I just clone the hard disk back to the SSD and resume – no weekend-long reinstall of the OS and apps. I’ve used this to save my buns a few times.
No, none, nada personal/financial/professional work on cloud servers. I do keep family photos and non-sensitive info on Adrive.com, 100GB @ $25/yr. Auto-backup.
I pay attention to this, as my finances and all my personal data are on these drives. And my work-at-home consulting business files are on the NAS – mostly C source code and projects for embedded systems work.
The NAS takes 20 watts. And it sleeps when I do.
But this proved to be overkill for us, and I took the disks out and used them as external drives again to back up directly.
Work stuff is all now in Git on the cloud, so I have a copy, the cloud (Atlassian) has a copy, and the client has one or more copies.
I feel this is due diligence.
Anything that is important gets burnt to DVD and put in a fire safe, as I don’t trust any disk drive that could get its contents encrypted by ransomware. DVDs don’t have that issue.
Albeit DVDs degrade over time, and you need to keep them in a cool dark place.
I think someone has developed some sort of DVD-style disc that is designed not to degrade (I think it’s a completely different technology), so in the longer term this would be a better option than DVD.
However, ultimately there are GBs of data that I have which, if lost, would not be the end of the world.
I have thousands of old photos in a box in the attic that I never look at, and even more digital photos that I never look at.
So I view most of the data as ethereal…
Live in the moment!
My clients’ stuff is not so secret that they need it on encrypted servers.
<…>
My last two clients were typical… I’m told to NOT put source code in the “cloud”, no matter if I can argue that it’s passworded, can’t be stolen, etc. My client logs into my NAS to pull a copy of what I release.
<…>
… Many (some large) cloud service providers have had rogue employees or ex-contractors do bad things: data theft, sabotage.
I was a customer of one such cloud service where the ex-contractor stole and sold customer info, then induced damage to the databases.
And there’s this famous one too: https://twitter.com/jasonpnoel/status/2 … 9305871361
The private NAS is HTTPS based and client’s login gets access to only their directory in which I put what they need to see.
Nothing is perfect.
Unless they want 24/7 access, send them a CD/DVD.
Needless to say, my people skills are not those of a diplomat.
stephen
But the largest project on which I subcontracted as a consultant through my company was valued at over $1B: $250M for hardware, $250M for capital software, and $500M for system coding labor hours. Such numbers made me very anal … as if I were not already anal from my communications security days in the USAF.

