Copy a single file using robocopy from a local folder to a
shared folder on the network.
A simple rule of thumb before disaster strikes: don't interchange the source and the destination.
If the source and destination are mistakenly reversed, files might get overwritten. To avoid any loss of data, do a test run with a dummy file to ensure things work as expected.
robocopy [source] [destination] [file to be copied]

robocopy c:\local_c_folder \\PC_network\shared_folder

(If no file name or pattern is given, robocopy copies every file in the source folder.)
The command will complete successfully provided there are no issues with network access rights.
Robocopy works quite well with large files. A simple copy or xcopy command will also work, but the speed may vary.
Robocopy is free and can be run from the command line. There is no need to install the Resource Kit tools if the operating system is Windows 7 or a newer version.
Copy files with selected file extension using PowerShell and Robocopy:
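The idea can be sketched with robocopy's own file-pattern argument (the *.docx pattern below is an assumption for illustration); the same line runs unchanged from PowerShell or cmd:

```shell
# The third argument filters which files are copied; /S recurses
# into subfolders. Pattern *.docx is just an example.
robocopy c:\local_c_folder \\PC_network\shared_folder *.docx /S
```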
A multi-homed server, that is, a server hosting multiple services and configured with different IP addresses, is quite common when there is a budget constraint. It is always good to have a dedicated server perform a single function, provided there is the luxury to do so.
So if you have just inherited a network whose servers have poor or nonexistent documentation, pray hard that nothing happens until you have things under control through proper documentation and familiarization.
It is quite tough to troubleshoot a problem when there is no documentation to depend on, and there will be a lot of surprises as the work journey goes on.
Listing the IP addresses of all your servers and keeping proper documentation will definitely help ease the tension when issues come up.
It also makes troubleshooting easier during a network outage or other circumstances that may arise.
Listing the IP Address of a server can be done in different ways.
Doing it remotely is quite convenient and of…
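On a Linux server, one quick way is the sketch below (assumes the iproute2 "ip" tool is installed; the SSH host name is a placeholder):

```shell
# List each interface with its IPv4 address.
ip -o -4 addr show | awk '{print $2, $4}'

# The same check run remotely over SSH (admin@server01 is hypothetical):
#   ssh admin@server01 "ip -o -4 addr show"
```

Capturing this output per server into your documentation gives you the IP inventory the paragraphs above recommend.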
Copying or backing up files to a remote machine is good practice if hard disk space is not a concern.
Keeping a backup on the same machine is like the old saying about putting all your eggs in one basket.
Backing up data to another machine removes the worry of losing it, provided that the backup really is a backup.
Backups should be tested as often as possible.
A backup is for disaster recovery; it should cover the disaster, not add a burden.
In Linux, "tar" is a good old tool that does a pretty awesome job of backing up or copying files.
TAR stands for tape archiver; as the name implies, it was designed for tape backups.
To automate backups using the "cron" scheduler, "tar" is a good choice, since it only requires the user name, the remote machine's IP address or hostname, and the path on the remote machine where the file should be copied.
Once the above requirements are known, tar will be able to copy to the remote machine and will not as…
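As a sketch of that workflow (the user name, host address, and paths below are placeholders), tar can archive a directory locally and, with SSH keys in place, stream the same archive to a remote machine, which also makes it easy to schedule from cron:

```shell
# Create a sample directory and a compressed archive of it.
mkdir -p /tmp/demo_src && echo "important data" > /tmp/demo_src/file.txt
tar czf /tmp/demo_backup.tar.gz -C /tmp demo_src

# Remote variant (placeholders; needs passwordless SSH keys for cron):
#   tar czf - -C /tmp demo_src | ssh backupuser@192.0.2.10 \
#     "cat > /backups/demo_backup.tar.gz"

# Verify the archive lists the expected file (backups should be tested).
tar tzf /tmp/demo_backup.tar.gz
```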
Deleting files is both good and bad.
It is good because it tidies up your computer, gets rid of unwanted or outdated files, and frees up space on the drive.
It is bad when no backup is available and the data is of great importance.
So deleting files should be done carefully and cautiously, making sure the deleted data will not be needed anymore.
How to delete files automatically in all folders and subfolders?
A batch file comes in handy to automate file deletion.
Below is an example that deletes any ".dll" files in the temp folder.
This can be applied to any folder path, but don't do it on the C:\Windows folder or the system will be completely unusable.
Open a command prompt and copy and paste the command below, or save it in Notepad and run it as a batch file (inside a batch file, change %a to %%a):

C:\Users\\AppData\Local\Temp>FOR /f "tokens=*" %a in ('dir *.dll /B /S') DO del "%a"
The command above will find all the ".dll" files in the temp folder …
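For comparison, the same recursive pattern-delete on Linux can be sketched with GNU find, exercised here on a throwaway directory; as with the batch file, preview the matches before deleting:

```shell
# Set up a throwaway directory with dummy files.
mkdir -p /tmp/find_demo/sub
touch /tmp/find_demo/a.dll /tmp/find_demo/sub/b.dll /tmp/find_demo/keep.txt

# Preview what will be removed, then delete only the matching files.
find /tmp/find_demo -name '*.dll' -print
find /tmp/find_demo -name '*.dll' -delete
```

The non-matching keep.txt survives, which is the careful, preview-first habit the paragraphs above recommend.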