The need for file archives
Did you ever edit a file, save it, and then say "oh S**$"? Were you ever working on your code at 3 a.m., only to find a zillion errors that you did not expect? Did you ever run "# \rm -fr *" because you worked an 80-hour week and lived on espresso?
As a developer, there were countless times when I wished I had saved a copy of a script, .h, .c, or .java file before a fat finger hit dd or 1000dd (followed, of course, by :wq to save in vi). svn, git, or cvs would not have saved me, since I had not finished testing the code I intended to check in. To solve this problem, I continuously back up my sandbox to a remote server, so I can revert to previous states of my code at any time.
You are working on some super secret project, and you don't want Bob, the system administrator at a remote backup location, to take your data and sell it to Joe the competitor, do you?
So, as a loyal and trustworthy sysadmin, you will solve this problem using your own private cloud over which you have full control.
Creating unlimited archives
EDpCloud can create an unlimited number of archives on a remote site for safe backup. All you have to do is (a) configure real-time replication or (b) create a schedule to back up data continuously, and configure the receiver to create a new archive file each time the content of a file changes. It will save your employer money, and it will save your reputation as well as your sanity.
For example, if the file foo.c changes on the local machine, the remote site will have foo.c and its older archive foo.c.$md5, where $md5 is the MD5 signature of foo.c before it changed.
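For instance, here is a minimal sketch of how you might locate and restore an archived version on the receiver, assuming archives land under /home/archives (the archivedir used in the configuration below) in a layout that mirrors the source path; the exact layout on your receiver may differ, and /home/user/sandbox is only a placeholder:

# On the receiver: list the archived versions of foo.c (newest first);
# each name ends in the MD5 of the content foo.c had before it changed.
ls -lt /home/archives/home/user/sandbox/foo.c.*

# Optional sanity check: the checksum of each archive should match its suffix.
md5sum /home/archives/home/user/sandbox/foo.c.*

# Copy the version you want somewhere convenient for inspection or restore
# (replace <md5> with one of the suffixes from the listing above).
cp /home/archives/home/user/sandbox/foo.c.<md5> /tmp/foo.c.restored

From there you can diff the restored copy against your working version and move it back into your sandbox if needed.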
Example configuration
The archive parameter controls the ability to go back and restore various versions of a file. (See the eddist.cfg man page on UNIX, or eddist.cfg.html on Windows and all other operating systems.)
<?xml version="1.0" encoding="UTF-8"?>
<config name="enduradata" password="Addoud4d4ch1n1gh4T4s4" workers="4">
  <link name="u" password="foo">
    <!-- Sender: the machine whose files are replicated -->
    <sender hostname="localhost" alias="*"
    />
    <!-- Receiver: archive="1" turns on archiving; older versions are kept under archivedir -->
    <receiver hostname="192.168.200.241"
      storepath="/home/backup" archive="1" archivedir="/home/archives"
    />
  </link>
</config>
Best practice
A good practice is to set up a link (replication set) for your sandbox or for your entire working directory, configure file replication for it, set the archive parameter specifically for that directory, and set up a schedule to back up your files. Make sure you also set up an exclusion regex so that .o files are not replicated (see the sketch below).
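The exact include/exclude syntax is described in the eddist.cfg man page; before you put a pattern there, a quick way to sanity-check it is to run it against your sandbox with standard tools. A minimal sketch, assuming the pattern \.o$ and a sandbox under ~/sandbox (both placeholders):

# Files the pattern would filter out (object files)
find ~/sandbox -type f | grep -E '\.o$'

# Files that would still be replicated
find ~/sandbox -type f | grep -Ev '\.o$'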
–a elhaddi