Wednesday, April 27, 2016

Create Logical "Home" Volume When Using Logical Volume Management (LVM)

1. Getting Started. Introductory Material.

This guide assumes that logical volumes are in use and that no volume is dedicated to the “home” directory. The process for establishing a “home” directory using partitions is described in the Ubuntu documentation Partitioning/Home/Moving. The process for logical volumes follows the same outline but differs substantially in its details. Reading the Ubuntu documentation before beginning will be beneficial.

The nature of logical volume management (LVM) is described here: What is LVM

The following document is being referenced for informational purposes: HowTo: Set up Ubuntu Desktop with LVM Partitions

This tutorial is based on Ubuntu 16.04.

2. Preliminary Step. Determine Volume Allocations.

Determine the amount of space that needs to be dedicated to the operating system (root) and the amount of space that should be allocated to “home”. Essentially, the process you will be following involves splitting the “root” volume into two volumes. One volume will remain "root" and a new volume, “newhome”, will be created. In the table below, “root” has an allocation of 915G. Fifteen gigabytes of data have been installed, of which 4.3G falls within the “home” directory.

Size of the Hard Disk - 1000G

Applicable Volume     Current Allocation (Usage)   Proposed Allocation
Partition1 (Boot)     (unchanged)                  (unchanged)
root                  915G (15G used)              100G
newhome               0G (4.3G now in /home)       815G (approx)
Total (approx)        915G                         915G
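For what it is worth, the arithmetic behind the proposed split can be sketched in the shell. The 915G current "root" size and the 100G target come from the example in this guide; everything else is simple subtraction.

```shell
# Allocation arithmetic for the split: root shrinks from 915G to 100G
# and the freed space becomes the new "newhome" volume.
ROOT_CURRENT=915
ROOT_PROPOSED=100
NEWHOME_PROPOSED=$((ROOT_CURRENT - ROOT_PROPOSED))
echo "newhome gets ${NEWHOME_PROPOSED}G"   # → newhome gets 815G
```

Adjust ROOT_PROPOSED to suit your own machine; just be sure it comfortably exceeds the space actually used by the operating system.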

Obtain your logical volume group name and volume name by using the Logical Volume Management application. This application is available through the Ubuntu Software Center by entering "LVM". This information is also available through the terminal by issuing the command "sudo lvdisplay". In this example the logical volume group name is “ubuntu-vg” and the logical volume name is “root”.

Backup your computer before going further!

3. Early Steps.

Boot the computer from a live CD.
The “root” volume must be unmounted.

Before attempting to use “apt-get” to download packages, run “apt-get update” to refresh the software list, should downloading be necessary.

Below are two reference resources that I relied on. Please note that the syntax used in some of the examples provided may not be available in Ubuntu.

Redhat LVM Configuration Examples

ArchLinux Wiki: LVM

4. Reduce the size of both the root volume and root file system.

Approach #1. The preferred approach.
a. Run: "lvresize -L 100G -r ubuntu-vg/root"

Approach #2.
a. Run: "e2fsck -f /dev/ubuntu-vg/root"
b. Run: "resize2fs /dev/ubuntu-vg/root 70G"
c. Run: "lvresize -L 100G ubuntu-vg/root"
d. Run: "resize2fs /dev/ubuntu-vg/root"

Notes: Apparently I did something wrong when using “lvresize”, since it would not recognize the “-r” option. Consequently, I used Approach #2. Use Approach #1 if you can.
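The Approach #2 sequence can be collected into one small script. The sketch below is illustrative only: it echoes each command instead of executing it, since resizing a mounted root volume is destructive. The volume group name and sizes are the ones from the example above; substitute your own.

```shell
# Dry-run sketch of Approach #2: print each resize command instead of
# running it. Remove the "echo" prefixes only after backing up and
# booting from a live CD with the "root" volume unmounted.
VG=ubuntu-vg
LV=root
DEV=/dev/$VG/$LV
echo e2fsck -f "$DEV"             # check the file system first
echo resize2fs "$DEV" 70G         # shrink the FS below the target LV size
echo lvresize -L 100G "$VG/$LV"   # shrink the LV to its final 100G size
echo resize2fs "$DEV"             # grow the FS back to fill the 100G LV
```

The ordering matters: the file system is first shrunk smaller than the target, the logical volume is then reduced, and finally the file system is grown to exactly fill the smaller volume.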

ArchLinux Wiki examples for reducing the size of the volume and file system.

5. Create new volume and establish file system.

a. Run: "lvcreate -l 100%FREE ubuntu-vg -n newhome /dev/sda5"
b. Run: "mkfs.ext4 /dev/mapper/ubuntu--vg-newhome"
ArchLinux Wiki example for creating a volume.

To view the results of the preceding operations use: "lvdisplay"

6. Modify the "fstab" file to mount "newhome" under /media/home

Modify the /etc/fstab file to add the lines:

# Logical volume for home
/dev/mapper/ubuntu--vg-newhome /media/home        ext4         errors=remount-ro       0       2

This can be done before you reboot or after you reboot into normal operation. From this point on you can follow the prior Ubuntu documentation on creating a home partition, starting with the section: "Copy /home to the New Partition".
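Before rebooting, it is worth confirming that the new entry carries all six fstab fields. A small sketch (the entry string is the one added above, with whitespace collapsed):

```shell
# Split the fstab entry into its six fields: device, mount point,
# file-system type, options, dump flag, and fsck pass number.
ENTRY="/dev/mapper/ubuntu--vg-newhome /media/home ext4 errors=remount-ro 0 2"
set -- $ENTRY                      # unquoted on purpose: word-split the entry
echo "device=$1 mountpoint=$2 type=$3 options=$4 dump=$5 pass=$6"
```

A pass number of 2 means the volume is checked at boot after the root file system, which is what a data volume such as "home" wants.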

7. Copy home into the new /media/home volume.

a. Run: "sudo rsync -aXS --exclude='/*/.gvfs' /home/. /media/home/."

8. Modify the "fstab" file to mount "newhome" under /home.

# Logical volume for home
/dev/mapper/ubuntu--vg-newhome /home ext4 errors=remount-ro 0 2

Don't reboot.

9. Save "old" home by copying it to "old_home" and creating an empty home folder.


a. Run: "cd / && sudo mv /home /old_home && sudo mkdir /home"
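The rename-and-recreate pattern in this step can be tried safely on a throwaway directory first. The sketch below uses a temporary directory rather than the real /home:

```shell
# Demonstrate step 9 on a temporary tree: "home" is renamed to
# "old_home" (keeping its contents) and an empty "home" is recreated.
TMP=$(mktemp -d)
mkdir "$TMP/home"
echo "data" > "$TMP/home/file.txt"
mv "$TMP/home" "$TMP/old_home" && mkdir "$TMP/home"
ls "$TMP"   # lists both home (now empty) and old_home (with the data)
```

Because "mv" only renames the directory entry, the old data is preserved untouched in old_home while the new, empty home is ready to be shadowed by the mounted "newhome" volume.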

OK to reboot. 

Final Notes: I did this once and kept the notes you see above; I will not claim to be any sort of expert. Evidently, importing the text from LibreOffice wasn't helpful in terms of formatting this web-page. Please let me know if there are any corrections and/or modifications that would benefit future readers.


Tuesday, February 16, 2016

The Multi-Year Science Fiction Magazine Database Extravaganza Endlessly Continues II

Screen shot of the revised main page on my testing version. As much as I hated to do so, I removed the Galaxy cover graphic as being too distracting. Now I have a lot more "real estate" for displaying a magazine's data. I will continue to display the Galaxy cover as the opportunity arises and where it will fit in.

Other enhancements, from the prior version:
  1. Edit issue and story data directly from the interface.
  2. Edit the names of the story author and the cover artist from the interface.
  3. Add the names of new authors and new cover artists from the interface.
  4. Add missing stories to an issue from the interface.
Previously, editing and/or adding certain data, such as the items identified above, had to be done through phpMyAdmin.

Tuesday, December 22, 2015

The Fallacy and Idiocy of the so-called "Golden Key" to Break Encryption

The recent terrorist attacks in Paris (France) and San Bernardino (California) have invigorated the demands of clueless politicians for the creation of a so-called "Golden Key" that would allow law enforcement to decrypt encrypted communications. In a recent Washington Post article, After terrorist attacks, the debate over encryption gets new life, the Post notes that: "On Wednesday, Sen. Dianne Feinstein (D-Calif.) became the latest senior lawmaker to call for such legislation. “If there is a conspiracy going on” among terrorist suspects using encrypted devices, “that encryption ought to be able to be pierced,” said Feinstein, vice chairman of the Senate Intelligence Committee." In regard to another clueless politician, the Post wrote: Kasich doesn’t understand how the tech that keeps you safe online works.

The problem is that should a "Golden Key" actually be developed and implemented as demanded by the clueless politicians, the "bad" guys, such as the terrorists, will also be able to use the "Golden Key" to break the encryption of the "good" guys, thus making the communications of the "good" guys insecure. Fortunately, the Washington Post also ran the article: A key under the doormat isn’t safe. Neither is an encryption backdoor, which counters the assertion by Sen. Feinstein that "piercing" encryption would be helpful.

The mere existence of a "Golden Key" means that it can somehow be stolen or otherwise acquired by anyone. An unwritten law is that secrets leak. Once acquired by the "bad" guys, they will be able to break the encryption of the "good" guys. That means the "good" guys, such as the banks, may find themselves susceptible to hacking. Furthermore, the "bad" guys will simply go to their Plan "B": the development of their own proprietary encryption. Thus the development and imposition of a "Golden Key" is a fool's errand. To protect the "good" guys, unbreakable encryption is required.

The necessity for unbreakable encryption, even if it unfortunately means that the terrorists benefit, is a complex topic. For more details and greater insight, I will refer you to the TechDirt theme concerning encryption. Please read the posts of the people commenting on the various articles.  They will provide much more insight than I have provided.

A link to a variety of articles published in the Washington Post on the topic of encryption. As with the TechDirt article, it is also important to read the comments provided.

A link to an old, but still relevant, 1997 article from the Electronic Frontier Foundation: Decoding the Encryption Debate.

A post by Troy Hunt:  Security Sense: Encryption is a necessity that cannot feasibly be compromised.

Phil Muncaster writes: IT Body: 'Let’s Not Weaken Encryption in Wake of Terror Attacks'. Mr. Muncaster quotes ITIC president and CEO Dean Garfield as saying: "Weakening security with the aim of advancing security simply does not make sense."

Monday, August 3, 2015

The Multi-Year Science Fiction Magazine Database Extravaganza Endlessly Continues

I now have a rudimentary version of a Science Fiction Magazine Database that works through your (internet) browser. The benefit of this approach is that it works on your home computer and/or LAN, does not require specialized database software such as Base or MS Access, and (at some future point) will be internet ready. However, it does require that you have MySQL and Apache (LAMP/WAMP) operating on a computer. There is still much work to be done.

Opening Screen
The image above is the opening (main) screen.  From the opening screen one can do an author or story search. Additionally, one can display all magazine issues or filter by magazine.

Eventually, subsequent screens (as you can see below) will need to be redesigned to remove the Galaxy cover as it tends to be distracting in subsequent screens.

Note: The "Author" field contains a hyper-link. Clicking on it will display all stories written by that author.

Listing of Each Magazine Issue in the Database

My apologies to those reading other magazines. I have just been tracking Analog. The image below displays some of the F&SF magazines that have already been entered. Please be assured that this database can handle all magazines.

Note: The "Magazine Name" field contains a hyper-link. Clicking on it will display all stories for that particular issue.

Filtered Issue List

Selection of an Author

Display of all Stories Written by the Author Selected

Selection of a Magazine Issue

Display the Contents of the Selected Issue

Listing of All Stories Having "Pluto" in Their Title
Note: In the screen above there are hyper-links for both the author and magazine issue. Consequently, one can lookup either the author stories or the contents of the magazine issue.

This project still needs a lot of work. For example, I have not yet gotten around to developing editing screens. It turns out that editing is much more complicated than simple data retrieval. For now, editing is done through phpMyAdmin, which works directly with MySQL.

Additionally, it seems that I will need to learn JavaScript for certain actions. Drat, yet another programming language to learn!!!

This project is essentially for my self edification. A more extensive and complete source of information is located at the Internet Speculative Fiction Database (ISFDB).

I anticipate that this browser based approach will be the final rendition in the development of this database. But then one should never say never. Eventually, I intend to complete it. But don't hold your breath.

Should you have any comments, please leave them.

Saturday, February 28, 2015

Backing-up MS Window Files to Linux on a Dual Boot Computer

Backing up files is one of those onerous tasks that has to be done if you want to preserve your data. Currently, I am running a computer using Linux (Ubuntu) with the capability to boot into MS Windows 7.
I assume that many people will have a similar configuration and a need to preserve their MS Windows files. This narrative will review one approach for automatically accomplishing that task.

While most of my work is in Linux, I still have occasional need to boot into MS Windows and modify files that need to be saved. Yes, MS Windows has a backup program that can save your work. The problem: I have never gotten that backup program to function reliably. Next, my time in MS Windows tends to be very short, which circumvents the automatic scheduling of backups. Finally, the stored data is in a proprietary format and is not portable. Consequently, I sought out an open-source solution from the Linux environment that would accomplish an automatic backup.

As a quick aside, the back-up media that I am using is a Western Digital 2T USB hard drive that is attached to a USB port on my router. This configuration was chosen based on the premise that one should not use the same drive to back up your data: you would lose both your data and the backup should the drive fail.

When operating in Linux, there are a variety of back-up programs. Currently I am using sbackup. I have liked sbackup, but it has proven to be finicky. In this case, it appeared that sbackup was backing-up the MS Windows files. But that turned out not to be the case. The apparent "simple" solution failed. Time for Plan "B".

Plan "B" involved creating a "windows_backup" directory in my Linux home directory, using the Linux copy command, and employing anacron to schedule the backup.

The /etc/anacrontab entry to implement the backup script (program) is below.
 1      20      window_backup    nice bash /home/steve/ShellScripts/
Essentially the syntax above says to run the script (program) once per day, 20 minutes after the computer boots into Linux (Ubuntu). Anacron manual page. The script below copies the MS Windows files into my Linux home directory and places them in the "windows_backup" directory. Sbackup then successfully stores the files onto the Western Digital USB hard drive. No manual intervention required. Yea.
# Executed from /etc/anacrontab

cp -f -R -L "/media/windows/Users/Stephen/My Documents/Access" /home/steve/windows_backup/
cp -f -R -L "/media/windows/Users/Stephen/My Documents/My Garmin" /home/steve/windows_backup/
cp -f -R -L "/media/windows/Users/Stephen/My Documents/POI_Data_Files" /home/steve/windows_backup/

chown -f -R steve:backup /home/steve/windows_backup
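The copy steps above can be generalized into a loop over the folders to be backed up. The sketch below runs the same cp options over a folder list; the paths here are temporary directories created purely for illustration, not the real /media/windows mount, so substitute your own.

```shell
# Generalized Plan "B" copy step: copy each listed folder into the
# backup directory using the same flags as above (-f force, -R
# recursive, -L follow symbolic links). Paths are throwaway examples.
SRC=$(mktemp -d)                       # stand-in for the Windows mount
BACKUP_DIR=$(mktemp -d)/windows_backup
mkdir -p "$BACKUP_DIR" "$SRC/Access" "$SRC/My Garmin"
echo "db" > "$SRC/Access/data.mdb"     # sample file to back up

for dir in "Access" "My Garmin"; do
    cp -f -R -L "$SRC/$dir" "$BACKUP_DIR/"
done
```

Quoting "$SRC/$dir" matters because folder names such as "My Garmin" contain spaces; unquoted, the loop would break them into separate arguments.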

As a concluding note, sbackup is only one of many Linux-based programs for backing up files. I am not that familiar with the other backup programs. Please do not consider my use of sbackup as an indication that it is the backup program to use. You may wish to do your own search. Other Linux-based backup programs may be able to successfully copy files from an MS Windows partition without the Plan "B" option noted above.