John Galea's Blog

My blog on Gadgets and the like

Adding EXIF data to your photos and Geotagging

There is a bunch of metadata written to your image files by your camera, called EXIF data; it includes everything from the date and time to camera settings, lens settings, etc. But there are a few things that neither my Canon T7i nor my GF's Nikon P900 writes that I wish they did.

While the Nikon P900 has the ability to geotag photos (at a cost to battery life, both when the camera is on and off), I was seriously disappointed that the T7i lacks this ability. I've looked at post-processing solutions and found one called Geotag Photos that works but was clumsy. Then I stumbled upon a simple solution. I use a Garmin GPS any time I'm exercising, and always figured there ought to be a way to take that GPS file and add in the geotags in post-processing. Well, there is! Turns out a tool called ExifTool can take a standard Garmin TCX file, which you can download from Garmin Connect; point it at a folder of images and magically the geotags are added. It actually works! If you want to check the results, there's a great website, Pic2Map, that can display the images on a map to let you verify it worked correctly. I tried a Garmin GPX file too, but the TCX gave a more accurate location when I tried it. Here is the command to add the geotags.

exiftool -P -overwrite_original_in_place -geotag=track.tcx *.jpg

I also wanted to add the author's name and a copyright notice. Well, I discovered I CAN have my T7i automatically add the owner info and copyright data to the images by following this guide. To say this is less than obvious is an understatement. And of course, there are ALL the other photos I've taken before I discovered this hidden little GEM. The Nikon P900 doesn't seem to do it, though. Using the same ExifTool I can add author and copyright using the following command:

exiftool -P -overwrite_original_in_place -Copyright="All rights reserved" -creator="Your name or email" -owner="Your name or email" -author="Your name or email" -artist="Your name or email" *.jpg

The -P tells it not to mess with the date of the file, and -overwrite_original_in_place keeps it from creating a backup of the file. You can also do RAW files like CR2s if you wish. The creator tag is what shows up as the Author field in the EXIF data in Windows for Nikon; oddly, for Canon it took the artist tag to make it work. Here's what it looks like when you're done. Here's a complete list of EXIF tags. It is worth noting that social media sites like Facebook strip most EXIF data, so this is NOT a way to protect your work; for that you still need things like a watermark.
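If you'd rather spot-check from the command line before uploading anything to Pic2Map, ExifTool can also read the tags back out. A quick sketch (the tag list here is just an example of what to check):

```shell
# Print the GPS and authorship tags back out to confirm they were written;
# -n prints coordinates as plain decimal numbers rather than degrees/minutes
exiftool -n -gpslatitude -gpslongitude -artist -copyright *.jpg
```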

Next up, it would be nice to add a comment that shows in Windows Explorer, and yes, this is searchable using the standard Windows search utility! To do this I used:

exiftool -XPComment="Raptor" IMG_1473.JPG

And this shows up:

January 22, 2021 | Uncategorized

ThinkPad L480s model 20LTS39500

I recently got a new laptop for work, a ThinkPad L480s, model number 20LTS39500. Not that this is a new model, but I thought I'd put it through its paces (you can query the model number via WMIC CSPRODUCT GET NAME). Ok, let's start off with this: it is NOT a model I'd recommend for everyone; this is a business-targeted workhorse. Let's start with the physicals: this is a lovely, thin, somewhat light model

Dimensions (W x D x H)          336.6 mm x 232.5 mm x 19.95 mm / 13.25″ x 9.15″ x 0.79″
Weight                       Starting at 1.58 kg / 3.49 lbs

Display-wise this is a 14″ 1366×768 panel, so obviously NOT a high-end display, and it shows: up front it's an average experience in crispness and vividness. This one is not a touch display, something that shocks me; I can't believe they still sell non-touch displays. Memory-wise this can go up to 32GB, which is awesome. Power-wise this is standard USB-C, which is also awesome, and I'm thrilled to see this trend. Adding a USB dumb dock, or even a smart dock, are options. I have a few of these and love how conveniently one connection takes care of everything. I look forward to when this is the standard, not that Apple are likely to embrace it anytime soon.

This particular model is completely undermined by a Seagate BarraCuda Pro ST500LM034 500GB 7200 RPM 128MB-cache SATA 6.0Gb/s 2.5″ spinning drive. The fact that laptops today are even offered with ANYTHING other than SSDs is shocking to me. As fast as this hard drive is, it will never compete with the silence, power draw, and random access of an SSD. That said, here are the numbers; it's a pretty fast hard drive. Using the system with 16GB of memory, the hard drive positively NEVER shuts up. I really have become so spoiled by SSDs; it's really hard to go back.

I quickly replaced the hard drive with an SSD; I went for a cheap, quick drive from Amazon, a WD WDS500G2B0A. I used Clonezilla to block-copy the old drive to the new one, which was needed since the drive included SecureDoc encryption. In short order the drive was copied. The L480 is one of the easiest systems I've taken apart in a while; the only complication was the thin ribbon cable that had to be removed to replace the drive. Once replaced I got improvements in both read and write, but more importantly in random access. Oh, and it's silent! Ahhhh. These numbers are still around half of what the specs say the drive is capable of ("Sequential read speeds up to 560MB/s and sequential write speeds up to 530MB/s"). Once cloned, Windows did have to reactivate, so be sure you have access to your activation server (KMS).

This has the typical TrackPoint and glide-point arrangement, which I love about Lenovos. The keyboard as always feels good and the keys are where they belong! The keyboard on this particular model does not seem to be backlit, though.

Processor-wise, the one I got came with an Intel Core i5-8350U, which is 4 cores, hyper-threaded. Memory-wise this is standard DDR4 with two slots, so choose your upgrades carefully. Video is the standard Intel UHD 620, which is fine for business use.

The camera is a 720p unit (required for facial recognition) with a dual-array microphone, which is adequate for video conferencing, a must today with so many working from home. It can also be used for Windows Hello facial recognition at sign-in, assuming your company hasn't disabled it.

Port-wise there are 2 x USB 3.1 Gen 1 (one Always On), 2 x USB 3.1 Gen 1 Type-C (Power Delivery, DisplayPort, data transfer), 1 x HDMI, a micro SD card reader, RJ45 gigabit Ethernet, and a headphone/microphone combo jack, as well as the standard Kensington lock slot. So, well appointed with everything you will need; and anything that's missing, USB 3 or USB-C can bring to you at speed.

All in all this is a perfect corporate workhorse. As mentioned earlier, though, I personally would NOT buy this model, due to the rotating media and the lack of a touch screen.

January 20, 2021 | Uncategorized

Apple Homepod Mini

A while back I played with an Amazon Echo a friend lent me. I liked it, BUT, given I bought into the Apple ecosystem a while back when I moved from Android, there was not a chance I was going to buy an Amazon Echo. Each of the digital assistants has subtly different ways it needs to be asked anything, and the idea that I would need to adapt to talking one way to SIRI and another to Alexa was more bother than it was worth to me. Apple had a smart speaker called the HomePod that was astronomically priced at $400; not a chance I was buying that, I'd give the money to charity and feel better for it first. Late in the fall Apple saw the issue and introduced the Apple HomePod Mini. At $129 it's a lot more reasonable, but still expensive compared to the cheapest Echo at $55 regular price, making it harder to have multiple devices throughout the home. I'd have to be head over heels to keep this device, let alone buy more than one. So with that, here we go … I pre-ordered one and it arrived just before Christmas. The device is an attractive, round black ball with a cable hanging out the back looking like a tail.

There is a small light on top, as well as volume controls. In a bizarre choice, Apple, the company that IMHO has been SUPER slow to accept USB-C, has used a fixed-length USB-C cable hard-tethered to the back of the HomePod that then plugs into a wall charger. If the cable isn't long enough you're stuck buying a USB-C extension cable or using a standard AC extension cord.

Initial setup was classic Apple, brilliantly done. Once powered on, my iPhone saw the new HomePod and guided me to quickly getting it set up, adding it to my Home app once I labelled the room it's in. To state the obvious, HomePods are really intended for people in the Apple ecosystem. I had never used the Home app before, so I had to figure out that that is where the HomePod got put. Oddly, Home settings is where you check whether the HomePod's firmware is current. WIFI was set up from the iPhone. Once on your network the HomePod does respond to pings, but little else; it does not have a web server, and in fact did not respond on the most common ports when I did a port scan of it. Once the initial user has the HomePod set up, then comes the head-scratching: how do I add the other members of the house to it, since the HomePod is supposed to be able to recognize numerous people's voices? Well this was, to me, not obvious, and took a trip over to Dr. Google, who prescribed adding users to my Home? Shrug … Ok, well, once that was done it was easy enough. Apple don't have any way to add guests to your home for the HomePod; people are part of your home, or NOT. And if you have multiple homes, well, that would be interesting to sort out; luckily (or not) I don't have multiple homes 😦

Running for days on standby I saw a legendarily low average of a little over 1W, peaking at 5W during power-on. This is really quite impressive; even music playback did not see this jump much at all. The light on the top, which for the most part is hard if not impossible to see, comes on when SIRI is listening for you to say something. Tapping on the top of the HomePod toggles play/pause.

Invoking SIRI is a FLAWED approach. No matter what, no matter how far away you are from the HomePod, no matter how close your phone is, no matter what background noise exists, it's the HomePod, not the phone that's right in front of you, that answers. To say this is problematic is an understatement; I have no idea what Apple are thinking. This means your HomePod needs to be as centrally located as possible, with no TV, radio or other conversations going on. The ONLY way around this is to manually call up SIRI on your phone rather than saying "Hey SIRI". A totally idiotic decision. I can only hope that at some point the idiot that decided this gets fired, the decision gets reversed, and the device with the clearest/loudest reception of "Hey SIRI" answers. If there is a reason I will return the HomePod, this just may be it; I just don't do stupid, and definitely not at this price point. Moving on.

SIRI itself worked pretty well. Requests to add notes and reminders all went properly to my phone, as they ought to. A timer I started got started somewhere in the cloud with no way to see how much time is remaining, another oddity of this device. Other SIRI requests work pretty much as expected, assuming it can hear you. If it's unsure who is speaking it will ask, which it even did when only one person was set up on it?

So other than SIRI, what other tricks are up its sleeve? Well, it can play music. Interestingly enough, when it first started playing music, in spite of the iPhone having a local library, the HomePod started playing some TuneIn radio station. How it chose the radio station I have no idea. I later went back to the HomePod and unpaused it, and once again some radio station, this time not even in my language, started playing. I can manually start music from my phone and direct it out the HomePod; this works and seems to go over WIFI vs Bluetooth, so as long as you are on the same WIFI network it keeps playing. Sound quality is good with nice bass, but this is a mono approach. For those with more money than brains, you can choose to buy two and make a stereo pair; ya, I have a great stereo, I don't think so. I suppose a use case is you could take this HomePod wherever you want and listen to music that way. Oh wait, there's no battery in it, and the cable is pretty short, so that ain't happening. Hmmmmmm. Oh, and in case it crosses your mind to use your HomePod paired with ANYTHING other than Apple, ah, that ain't happening; there's no way to pair, say, an Android tablet or phone, or a PC, with the HomePod. Surely everyone in your home is part of the Apple ecosystem, right? Oh, you can stream audio content from your Mac like any other AirPlay-compatible device, so I guess there's that (if I owned a Mac).

The HomePod can be used as a speakerphone for your iPhone so everyone in the house can join in on your conversation. For some this will be a killer feature; for others it will be "oh no, that idiot is using the HomePod and we get to listen to their call yet again". If you've ever worked in an open-concept workplace, speakerphones are a scourge that only the most inconsiderate TOADs use; that, or the person that walks into, say, a conservation area using their speakerphone so you all can join in on the call rather than listen to nature. I digress … AGAIN.

Apple have included an intercom function in the HomePod. Well, sort of; it's more like a walkie-talkie than a two-way conversation. From anywhere (in the house, out of the house) you can have the HomePod say something in your voice; it gets sent digitally and played, and then someone in earshot can say a reply. It kinda, sorta works.

In the movie Home Alone the hotel clerk gets asked "what kind of idiots do you have working here?" "Only the finest!" And this HomePod clearly shows Apple's dumbest group is working in the HomePod section of the business. Amazon and Google have nothing to fear, and I can only imagine them rolling on the floor laughing at Apple's entry into this space. So will I keep this device? To say I'm thoroughly disappointed is an understatement of EPIC proportions. If the ONLY thing I hoped for was a local SIRI device that looks nice and draws little power, then this device fits the bill, albeit expensively; but this is Apple, and if you expected cheap, or even competitive, pricing, you're delusional. Should you buy one? Well, that's about expectations: if all you want is a kinda-ok SIRI experience at a high price, have at it. Otherwise, if you're expecting another innovative product from Apple that will change and enhance your life, well, this just might not be what you were hoping for.

If you want a ghetto version of this, take your old iPhone 6 that is of no use now, plug it in, and voila, you have a free "Hey SIRI" device 🙂

December 30, 2020 | Uncategorized

Nextcloud VM install (final setup)

I've played a lot lately with Nextcloud and Owncloud; it's been fun. So let's quickly review. I really like the functionality you can achieve with Nextcloud, and it meets my basic needs once all of the additional functionality like email/calendar etc. that I don't want is removed. I'm looking simply for a self-hosted, safe, secure Dropbox-like space where prying eyes like Google aren't looking into my documents. The Nextcloud iOS app includes iPhone photo sync, which was the killer feature that moved me from Owncloud. There are a number of ways you can install Owncloud/Nextcloud, from a container to a VM. I have not been able to get the container's space to correctly map outside the OS drive, or other containers, so there is the risk of users overflowing the available space. This can be managed through quotas at the OS level and inside Nextcloud/Owncloud, but this isn't what I wanted. From a VM point of view I found an all-in-one install script from Nextcloud, but as with anything like this, while it gets you up as painlessly as possible, it also has a particular preferred method. For me, I didn't like that it used Postgres (I prefer MySQL/MariaDB), and it used ZFS (instead of LVM) for the data drive. At some point I may need to extend the partition, so I want to make that as painless as possible. I also tried the Owncloud VM template and hated it because it used, and exposed, Univention, a container web front end. And so we have what I hope will be my final setup. What I want is Apache on the front end, secured by SSL, and Nextcloud with its data on its own drive, running on Ubuntu 20. I will then port forward this to my domain, since I have still been unable to find a way to reverse proxy this using a separate NGINX instance. Here's how to get there. I will repeat myself a bit in the hope of this article being all-inclusive.

First off, install Ubuntu 20, and install ONLY SSH. There is an option at install time for Nextcloud, but I'm not sure what that selects or does. I made the OS drive 20G, which once installed leaves 13G free. I added a second drive, which you can make any size and expand in the future if need be. To set it up, do the following from the OS. Partition the new drive using fdisk:

fdisk /dev/sdb

  • select n for a new partition
  • select p for primary
  • accept the defaults for partition start/end etc.
  • select t to change the partition type
  • enter 8e to make it an LVM partition
  • select w to write it and quit fdisk

pvcreate /dev/sdb1
vgcreate nextclouddata /dev/sdb1
lvcreate -L 49.9G -n data nextclouddata
mkfs -t ext4 /dev/nextclouddata/data

Next up you need a mount point for the new space:

mkdir /nextclouddata (or whatever you want)
vi /etc/fstab to add the mount point at boot
/dev/nextclouddata/data /nextclouddata  ext4 defaults 0 0

You can reboot to see if it mounts correctly.
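Once it's back up, a couple of quick commands confirm the volume came up where you expect. Nothing here is specific to Nextcloud, just standard LVM/mount checks:

```shell
df -h /nextclouddata    # should show the ext4 logical volume mounted with its full size
lvs nextclouddata       # should list the 'data' logical volume in the nextclouddata VG
```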

You're now ready to start your install. First up you need to install a LAMP stack (Linux, Apache, MariaDB, PHP); this guide was perfect. At one point it talks about installing Apache for PHP-FPM; you can ignore that part of the guide. Personally, I already have a MySQL instance in the house, so I will simply use that, and I don't need to install MariaDB. Now you're onto the Nextcloud install itself, which this guide covers. The configured default site is a little odd in that it mounts as owncloud.example.com, so I manually edited the config file (vi /etc/apache2/sites-available/000-default.conf) to look like this:

Listen 80
<VirtualHost *:80>
ServerAdmin webmaster@ssl-tutorials.com
DocumentRoot /var/www/nextcloud
ErrorLog ${APACHE_LOG_DIR}/error.log
CustomLog ${APACHE_LOG_DIR}/access.log combined
</VirtualHost>

You're now ready to change the permissions on the mount point to allow Nextcloud to use this new LVM-backed drive:

chown www-data:root /nextclouddata
chmod 770 /nextclouddata

You're now ready to configure Nextcloud. Hit http://ipaddress to get started. For me, I wanted to NOT install the standard apps, wanted an external MySQL database, and wanted to point the data directory at the /nextclouddata mount created earlier. You do not need to do anything manually on the database: just give it your root or DBA account and it will create the database, create the account it will run as, and set everything up for you. It does NOT run as the DBA account you give it, so not to worry. Again, remember to untick the "install standard apps" box that's just off this screenshot.

After a little bit of time Nextcloud is up and ready for configuration. Log on with your admin account and configure the admin account's email address by selecting Settings. Then configure your email server settings under Basic settings.

Be sure to test it out. Email is how your users are informed their account is set up, informed when something is shared with them, and how they can reset their own passwords, so this is a pretty important step to get done right up front. Next you're going to want to go to the Apps section and disable anything you don't know you need. I also disabled the Deleted files app to make sure anything deleted is gone immediately, rather than retained for 180 days or until the end user remembers to empty their deleted folder (sure they will remember ;)). I want the files on the server encrypted at rest so that if Apache gets compromised, or SFTP gets hit, the files are safe; so enable the Default encryption module while you're here. You're then ready to go to Settings, Security to enable server-side encryption; read the warnings and move on …

I upload a file like a JPG into the file space, then use SFTP to go in and grab it out from under Nextcloud. If encryption is working correctly this file will not be readable as a JPG. I like to test to be sure; as they say, trust but verify. Initial basic setup of Nextcloud is now complete.
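A quick way to inspect the file you pulled out via SFTP from any Linux/macOS shell (the filename is a placeholder; and note that, as I understand it, Nextcloud's server-side-encrypted files start with an "HBEGIN" header rather than JPEG magic bytes):

```shell
file pulled-copy.jpg        # a still-encrypted file reports as "data", not "JPEG image data"
head -c 32 pulled-copy.jpg  # look for the "HBEGIN" marker instead of JFIF/Exif
```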

For me the next step is to add my own certs, but you could also use a self-signed cert. The big issue with a self-signed cert is that users get an insecure warning and panic, so my own cert it is. You will need to add SSL support to Apache by enabling the SSL module with the command a2enmod ssl, then editing your config file to point at your certs and change your port. You then need to go to your router and open up the port you chose. I know I've glossed over this last step, but there are lots of guides, and whoever you buy your cert from will likely have an Apache config document. I bought mine from Positive SSL.
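For completeness, here's roughly what the SSL side of that Apache edit ends up looking like once a2enmod ssl has run. This is a sketch only: the port, cert paths and filenames are placeholders, and your cert vendor's Apache guide is the authority here:

```
Listen 8080 https
<VirtualHost *:8080>
    ServerAdmin webmaster@ssl-tutorials.com
    DocumentRoot /var/www/nextcloud
    SSLEngine on
    SSLCertificateFile      /etc/ssl/certs/example.crt
    SSLCertificateKeyFile   /etc/ssl/private/example.key
    SSLCertificateChainFile /etc/ssl/certs/example-chain.crt
    ErrorLog ${APACHE_LOG_DIR}/error.log
    CustomLog ${APACHE_LOG_DIR}/access.log combined
</VirtualHost>
```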

Update: I tripped over another great addition to my Nextcloud! Using OnlyOffice I can add the ability to create and modify spreadsheets, documents and presentations, giving me Google Docs-like functionality with local storage! Installing it was pretty easy. First off you need to locate and edit the install.php file on your Nextcloud server to increase the timeout; it seems OnlyOffice is BIG, as called out in this post! Then it's a simple two-step process of installing the OnlyOffice plugin and then installing the document server, as called out in this post.

I also discovered that Nextcloud supports WebDAV, which can in turn be used to map a Nextcloud logon as a drive letter on a Windows PC using Windows Explorer's "add a custom network location". This makes accessing your Nextcloud even more convenient: no web interface needed. And anything saved is, of course, encrypted at rest!
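For reference, the WebDAV endpoint Nextcloud documents is /remote.php/dav/files/USERNAME/. A sketch of mapping it from a Windows command prompt (the hostname and drive letter are placeholders; Explorer's "add a custom network location" wants the same URL):

```
net use N: "https://cloud.example.com/remote.php/dav/files/USERNAME/" /user:USERNAME
```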

For a good giggle I took this and loaded it up on a VERY old dual-core Atom box with 4G of RAM, and it ran perfectly fine. Encryption, as you would expect, was noticeably slower, but otherwise …

I've been playing around with the right set of settings for properly backing up an iPhone's pics. What I found works best is to NOT select the default, which is "most compatible". That way Live Photos get left as HEICs. I also prefer to have them in their own folder on Nextcloud so that neither the root nor your photo directory gets cluttered; that way you can also have, say, another phone or tablet and keep them all separate rather than merged. I also turned on "maintain original filename" to stop the filenames from getting scrunched. With this I have an acceptable iPhone backup, off iCloud, with no limits but my own!

December 25, 2020 | Uncategorized

NextCloud container

Open source is a collective of coders that get together for a project. Eventually a parting of the ways happens, and a group of those coders goes their own way, sometimes creating a new version of the project they were working on, referred to as a fork. Well, I previously wrote about Owncloud, which did exactly what I wanted: provide a safe place for me to drop files for others to come and get without the prying eyes of cloud providers or governments. Not that I'm doing anything untoward; it's more about privacy. From the start my friend Lance told me to skip Owncloud and go to Nextcloud. When I first loaded Nextcloud I hated it. Way too much loaded, way too busy, way too complicated to hand to a non-technocrat and expect them to know how to drop a file, so I went with Owncloud. And then I saw something about mobile sync and discovered the Nextcloud iOS (iPhone) app supports photo syncing. I've long been irritated by Apple's ransoming of my photos and the continuous nagging about iCloud being full … buy more or else. Sadly, Owncloud's iOS app does not support this, and so I decided to have another look at Nextcloud. I have no need of mail/calendar/chat and all kinds of other clutter Nextcloud loads up, so the best place to start is at the install. I've decided to go with a container for the initial install, and I chose Ubuntu 20 as the host. I decided on a separate host to play, and to segregate, so that someone filling up my space doesn't bring down my entire container host. While this could have been handled other ways, this was how I went forward and why.

To cut to the chase, here is what I found as improvements of Next over Own:

two-factor authentication support, forced at a system level or at a user level
photo sync support on iPhone
you can set a user's initial password for them or let them choose their own
notifications when something is shared with you

To deploy the container I used the following (this container includes Apache and Nextcloud):

docker run \
--name=nextcloud \
--hostname=nextcloud \
-p 192.168.2.223:8080:8080/tcp \
-p 192.168.2.223:80:80/tcp \
-e VERSION=latest \
-e TZ="America/Montreal" \
-v nextcloud:/var/www/html:rw \
--restart=always \
nextcloud

This exposes both 8080, where I intend to publish an SSL-secured site, as well as 80 for initial setup. I decided I'd use my already-set-up MySQL backend. Nextcloud requires no prep of MySQL: just give it the root or DBA account and it does everything for you; it will create the account it runs as, create the DB, etc. For me, one of the keys to tolerating the clutter of Nextcloud is on the opening screen where it says "install default apps": um, no thanks. It's a tick box just out of view of the next image.

And with that it's installed, ready to be configured. I recommend an admin account called something other than admin or administrator; those are too obvious. The next steps are very similar to Owncloud, but I'll replicate them anyway. First off, set your admin's email account (Settings, Personal info) and set the SMTP server so emails can be sent to users when their accounts are created (Settings, Basic settings); it also allows them to reset their own passwords. Since Nextcloud allows you to set a user's password when you're creating it, this isn't as critical as it was in Owncloud, but nonetheless, might as well get it done. Next up was to dramatically simplify the Nextcloud clutter by disabling what I don't want from the Apps section. I removed dashboard, weather, status etc. and cut it to the minimum; you can always put stuff back if you need to, or ever want it. I even disabled the Deleted files app so end users' files are gone right away once deleted. I enabled the Default encryption module, and then turned on server-side encryption. This ensures that files are encrypted at rest. I double-check this by downloading a file using SFTP out from under Nextcloud to ensure it's unreadable.

With this you have a basic setup, but it isn't ready to use since there's no SSL. Fortunately Apache is part of the container, so it's pretty easy to set up. Unfortunately they did not enable the SSL module, but this is pretty easy to fix. So to get all this done I manually customize the files and then copy them into the container using:

docker exec -i nextcloud a2enmod ssl
docker stop nextcloud
docker cp (your cert files) nextcloud:(somewhere in the container you can then reference)
docker cp 000-default.conf nextcloud:/etc/apache2/sites-available/000-default.conf
docker start -i nextcloud

And with that you have SSL enabled. I've not yet figured out an NGINX reverse proxy, so for now I just open 8080 as SSL to the Nextcloud IP. I've been using VEEAM to back up my VMs, but I also grab a number of key Apache config files and do a database dump of MySQL using the command:

docker exec mysql mysqldump --user=root --password=password nextcloud > /home/movies/nextcloud.sql
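Since that dump only captures a point in time, a crontab entry is a natural way to run it nightly alongside the VEEAM job. A sketch, assuming the same container name and dump path as above:

```
# run the Nextcloud DB dump every night at 02:30 (added via crontab -e as root)
30 2 * * * docker exec mysql mysqldump --user=root --password=password nextcloud > /home/movies/nextcloud.sql
```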

To update Nextcloud I use the following commands and then rerun the create command I started with. You will need to reconfigure Apache as well, using docker cp as above:

docker stop nextcloud
docker rm nextcloud
docker rmi nextcloud
./nextcloud-create (shown above)

So with this, Nextcloud is now up. There are really two things I'm not fond of with this setup. Updating is a little more complicated, though the above process works. The bigger concern is that I have not been able to map the container space outside the OS drive, so you could have a situation where the drive fills up and brings the OS down or paralyzes it.

December 25, 2020 | Uncategorized

Owncloud take two … and quickstart guide

I previously looked at the Owncloud template VM and found Owncloud does exactly what I want, but hated the way the template was implemented … At the risk of repeating myself, I will make this post stand alone by including the intro …

From time to time I need to exchange sensitive files with people, like lawyers, that themselves don't have secure file drop facilities. At this point I am uncomfortable dropping anything even remotely sensitive on pretty much any of the clouds. Call it paranoia, call it informed, you choose; but ever since the US passed the CLOUD Act back in 2018, which made it easier for governments to get access to your data irrespective of data residency, I'm more and more concerned about my data. Now personally, I don't think anything I am doing warrants (pun intended) attention; that's not really the point. So I decided I would look at self-hosted cloud space: somewhere I can put files and have others easily grab them.

I looked at the Owncloud container but ran into a number of issues that stopped it being my choice. Owncloud has removed the web interface from the container, expecting it to be reverse proxied, but I found the documentation on how to get NGINX to reverse proxy an Owncloud container, whether on the same container host or not, to be unhelpful, and I burned a lot of time on this. I'm also a tiny bit concerned that the Owncloud container could fill my container host and compromise all of my other containers. There are ways to contain this, but without a working reverse proxy solution I was dead in the water. I also had issues deploying the VM template. So with that, I went ahead and followed a guide to installing Owncloud on a dedicated Ubuntu VM. This is the simplest way to implement Owncloud. So I installed Ubuntu Server and followed the guide. It actually was pretty complete. There were a few steps left to do: I wanted to secure the front end with SSL, but this was pretty well documented on a number of sites; just google installing a cert (which I already had) on Apache. Because I was unable to get a reverse proxy running, I simply port forwarded an externally unused port, 8080, to this Owncloud server. You're going to want to get this all set up, tested and working before you move on to the next step. What you end up with is Ubuntu 18 (which isn't the most current), Apache, and Owncloud installed as a web application inside it.

The initial setup wizard guides you through choosing a back-end SQL server and setting up an initial admin account. At this point you think you're done, but the wizard, and even other guides, fall down here, and it's confusing what's going on. I tried, unsuccessfully, to set up a MySQL back end on a different machine, only to discover this is a common issue; in my case it turned out to be a permissions issue. I had initially set it up with SQLite3, only to discover this has virtually no security; I guess that's why they don't recommend it 😉 So in the end, give Owncloud your root DBA account and what it will do is create the database, create an account and password, and configure Owncloud to use it.

So, here are the next steps to getting Owncloud usable. First up, you need to set up the email address for the admin account. This MUST be done and you can't go further until it is. It's a simple step: log on to the admin account, then Admin, Settings, General, add your email address and click Set.

When you create a new account within Owncloud, it sends the end user an email to set their initial password. If email isn't set up, this goes nowhere: the account is created, but you can't use it. Something I didn't find was well explained anywhere. So … next up, set up email: settings, Admin, general, and test. You can not get any further until these two steps are done.

It’s important that your end state, reverse proxy, port forward or whatever you choose is up and working at this point, because Owncloud will use how you have logged on to create a URL for an end user to setup their initial password. Especially important for people not on your internal network. At this point I recommend you create an account for yourself to be able to test out what the link end users will get, and follow to set their initial password. I created a group first off I call users. Go to users then Add group. Then

Now you're ready to create a user:

There is a way around the email requirement: set the email address to yours, create the account, set the initial password by following the link, and change their email later. Email is also how end users reset their own passwords.
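If you prefer the command line, ownCloud's occ tool can do the same group and user creation. A sketch, assuming Owncloud lives in /var/www/owncloud; the group, user and display name are placeholders:

```shell
# Create the group, then a user inside it; with email configured,
# the new user gets the set-your-password email automatically
sudo -u www-data php /var/www/owncloud/occ group:add users
sudo -u www-data php /var/www/owncloud/occ user:add --display-name "Test User" --group users testuser
```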

By default Owncloud holds onto deleted files for a retention period to allow users to change their minds. Expecting end users to clean out their trash bins is optimistic at best, so to get around this you can change the retention period by following this article. Or better yet, you can disable the deleted files app so that files are immediately deleted, perfect!
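Both options can also be done from the command line via occ, a sketch assuming the same install path as above:

```shell
# Option 1: shorten the retention period (here 7 days, a placeholder value)
sudo -u www-data php /var/www/owncloud/occ config:system:set trashbin_retention_obligation --value="auto, 7"
# Option 2: disable the trash bin app entirely so deletes are immediate
sudo -u www-data php /var/www/owncloud/occ app:disable files_trashbin
```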

Expanding the drive space, should your needs grow, can be done in one of two ways: by adding a second drive and changing the mount point by following this article, or by extending the volume group.
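The volume group route is a handful of LVM commands. This is a sketch only, with placeholder device and volume names you would replace with your own (check yours with lsblk and lvdisplay first):

```shell
# Add a new disk to the volume group, grow the logical volume,
# then resize the ext4 filesystem to fill it
sudo pvcreate /dev/sdb
sudo vgextend ubuntu-vg /dev/sdb
sudo lvextend -l +100%FREE /dev/ubuntu-vg/ubuntu-lv
sudo resize2fs /dev/ubuntu-vg/ubuntu-lv
```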

By default, files are stored unencrypted in the web path, so a breach of Apache would leave these files vulnerable. I didn't like that thought, so I turned on encryption at rest. Files that are taken out from under Owncloud are then encrypted and useless. That's not to say they can't be decrypted, but it's going to take some work. To do this you have to enable the encryption module and then enable server side encryption, but you have to click show disabled apps to see and enable encryption.
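For reference, both encryption steps can be done via occ rather than hunting for the hidden app in the web UI, again assuming the same install path:

```shell
# Enable the encryption app, then turn on server side encryption
sudo -u www-data php /var/www/owncloud/occ app:enable encryption
sudo -u www-data php /var/www/owncloud/occ encryption:enable
```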

It’s always important to backup your main config files, they are: I used this script to back it up:

cp /var/www/owncloud/config/config.php /home/backup/owncloud/config.php
cp /etc/apache2/conf-available/owncloud.conf /home/backup/owncloud/owncloud-conf_$(date +"%Y%m%d").conf
cp /etc/apache2/sites-available/000-default.conf /home/backup/owncloud/default_conf_$(date +"%Y%m%d").conf
cp /etc/apache2/ports.conf /home/backup/owncloud/ports_conf_$(date +"%Y%m%d").conf
mysqldump --user=ownclouduser --password=xxxxxx owncloud > /home/backup/owncloud/owncloud-dbbackup_$(date +"%Y%m%d").sql
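If you save those commands as a script, cron can run it unattended. The script path here is a placeholder:

```shell
# crontab -e entry: run the backup script every night at 2am
0 2 * * * /home/backup/owncloud-backup.sh
```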

With this I have exactly what I wanted; my needs are quite modest. Even with encryption it takes just 1G of memory and 1 vCPU with little impact on upload speeds (I tested it with a 1G file). I love that Owncloud even preserves the original file date!

December 15, 2020 Posted by | Uncategorized | Leave a comment

Owncloud VM template

From time to time I need to exchange sensitive files with people like lawyers who don't themselves have secure file drop facilities. At this point I am uncomfortable dropping anything even remotely sensitive on pretty much any of the clouds. Call it paranoia, call it informed, you choose, but ever since the US passed the CLOUD Act back in 2018, which made it easier for governments to get access to your data regardless of data residency, I'm more and more concerned about my data. Now personally, I don't think anything I am doing warrants (pun intended) attention, but that's not really the point. So I decided I would look at self hosted cloud space: somewhere I can put files and have others easily grab them.

I looked at Owncloud first. It is available as an appliance in many formats; I chose VMware. The VM is configured with 1 CPU and 2G of memory, so pretty light. Upon first boot, you're guided through setup on the console of the OS, which makes it pretty easy. The system is updated upon first boot.

The VM consists of a Debian based OS built with the Univention portal, which manages users (among other things), and Docker installed. Owncloud is installed as a Docker container. Apache is installed in the base OS, so any work with Apache, such as SSL, is done in the VM, not the container. Initial logon at the web root brings you to the Univention interface.

Logging into Univention allows you to set up users. You can set up users within Owncloud as well, but it's easier and more complete to set them up within Univention. From here you can configure the users, quotas, etc.

File systems are set up as volume groups so they can be expanded as your needs grow. Out of the box there is 41G free, more than enough for my needs. The VM is set up as a thin disk so it will take minimal amounts of space.

Within the Univention interface you can also check for updates to the OS and the container. There is also console access, complete with SSH, to the host OS.

The mount point for the container has not been nicely mapped, so you're in for a long path to get to the container's data space. Logging on to the container itself is done using the traditional Docker interface should it be needed, which isn't likely. This same interface is not hardened (you can even log on as root) and so provides a point of attack.

Users will never use the Univention portal; they will log on to the Owncloud interface, which is at the URL http://machinename/owncloud/ Out of the box, ports 80 and 443 are both enabled and a self signed cert is created, neither of which you will want to leave as is. Unfortunately the admin console for Univention is available externally, as well as the cloud interface, which wouldn't be my choice; I'd prefer administration was done ONLY over the local network. Additional applications can be loaded onto the Univention portal. All in all it got me up quickly, but I was unimpressed with the large attack surface and abandoned this approach, though not Owncloud itself. See the next posts for what came next!

December 14, 2020 Posted by | Uncategorized | Leave a comment

Ghetto NAS

A long time ago I bought an Asrock ION 330, which I used as a media player for a while. It's a small, low energy, quiet device. It has 4G of RAM, 1G wired ethernet and a dual core Atom.

Normally it has one DVD drive and one 2.5″ drive, but I was able to hack in two 2.5″ drives. This makes it perfect for a ghetto NAS device. So I installed Windows 10 onto it, created a share and, le voila, we have a ghetto NAS. My intention is to use this to dump critical backups to once a month or so. It only draws 35W when powered on, but since I want to dump to it only occasionally, and since Windows power management is good and this device is completely supported, I'll put it to sleep when not in use. I can direct everything remotely, so this will work perfectly and draw next to no power. Using a power meter I found hibernate drew the least power, and sleep drew almost as much as active, so hibernate it is. First off you need to enable remote PowerShell commands by running this on the NAS:

Enable-PSRemoting

To put it to sleep I used this command:

powershell Invoke-Command -ComputerName 192.168.2.217 -scriptblock {start shutdown /h}

The trick in this one was to use the start command, otherwise the command never came back. This works perfectly. This device supports Wake on LAN, but not Wake on LAN from power off, which is why I use hibernate. To wake it up I use a tool called wolcmd, which also works perfectly.
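For what it's worth, wolcmd takes the MAC address, IP, subnet mask and port on the command line. The MAC here is a placeholder for the NAS's own adapter:

```shell
wolcmd 00AABBCCDDEE 192.168.2.217 255.255.255.0 9
```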

For the copy duties I use the built in robocopy tool, part of Windows! To harden it up a little I went into the firewall settings and disabled the incoming things I don't need. And with that, I have a free NAS that draws very little power!
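My copy jobs are just robocopy one-liners along these lines. The source path and share name are placeholders, and note that /MIR deletes files on the NAS that no longer exist on the source, so use it deliberately:

```shell
REM Mirror the local backup folder to the NAS share,
REM restartable mode, retry twice with a 5 second wait on errors
robocopy D:\Backups \\192.168.2.217\backup /MIR /Z /R:2 /W:5
```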

December 10, 2020 Posted by | Uncategorized | Leave a comment

Unravelling iPhone photos

More and more we use our phones to take pictures, and as we do, a large mess builds up that Apple has made difficult to unravel. First off, let's talk about how an iPhone (or iPad) manages photos. By default, the phone backs up your photos to the Apple cloud, iCloud. Sounds good. But unknown to you, the original is no longer stored on your phone. Over time, next thing you know, you are getting nagged that your iCloud storage is full and you need to pay for more. Hmmm, this sounds self serving. I have managed this by simply deleting photos on the phone, which makes the nag go away for a little while, but then it comes back again and again. So I decided I would take a look at solving this ongoing problem, because I don't want to pay for more and more storage on Apple's cloud; it just irks me. By the way, Apple explains all this if you read …

I looked into a number of apps that would run on the phone and use my own storage, and that's when I bumped into the issue that the photos are actually not on the phone. Amazon Photos tries to deal with this by downloading the photos and then uploading them to Amazon. As a Prime member I have unlimited storage. For whatever reason, this just kept failing en masse.

So I headed over to iCloud and decided the first thing to do was download the photos I have; then I could delete them from iCloud. This is easier said than done. First of all, they only allow you to download 1000 at a time, and I had more than that. Second, once downloaded, all the dates on the files are the date you downloaded them, losing the date taken. This is discussed all over the place. To fix this issue I found a fantastic free tool called jhead that will replace the date on the file with the EXIF date taken field. It works like a charm using the command:

jhead -ft *.jpg

This worked almost perfectly, except on the occasions a file was downloaded and there was no EXIF data.
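To find those stragglers with no EXIF date, the same EXIFTool from earlier has an -if filter that can list them; a sketch, using Windows style double quotes:

```shell
REM List only the files that have no EXIF "date taken" field
exiftool -if "not $DateTimeOriginal" -filename -q *.jpg
```

Those files can then have their dates fixed by hand.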

Ok, so now all the files are downloaded and date corrected, so now I can delete them from iCloud. Well, Apple wants to be 100% sure you did what you wanted, so deleted files are moved into a recycling box for 30 days, meaning you will need to go and clean them out of the recently deleted folder. The iPhone continued to keep the photos I'd deleted. I assume this would eventually sort itself out, but I wanted this resolved NOW, so I went ahead and turned photos off in the iCloud settings, and all photos were deleted and cleaned up. Finally the mess is untangled.

Now I can download Amazon Photos with a clean slate and go forward with UNLIMITED storage (as long as I pay for Prime) and NEVER be nagged by Apple about buying more space again. I am shocked how complicated this was, and I can totally see why people just give up and pay Apple.

There’s a point worth noting, Apple brought about it’s own image format to support live photos. Live photos are a series of frames taken that can then be manipulated within the iPhone photos app. Manipulations include things like loop, bounce, long exposure (etc). Downloading images on a PC looses these frames and all you get a MOV of the frames and a jpg of whatever manipulation was used on the photo. It MIGHT be different if these were downloaded on a Mac.

November 13, 2020 Posted by | Uncategorized | Leave a comment

Canada’s COID alert app

The federal and provincial governments are encouraging us all to load this app onto our phones. A number of employers have gone so far as to mandate it be loaded on corporate devices. So what is this app, and how does it work? It uses Bluetooth and signal strengths to record and estimate who has been close to you. If people test positive for COVID, they inform the app, and it in turn informs those who have been close to them. Sounds good, but let's get into some details … The app has been heavily focused on people's privacy. This means the data in the app, and what it shares with you, is severely limited. Ok, so you get an alert from the app, oh crap, now what? Well, here is what you get told.

So what you get from this is that some time in the last two weeks, you were near someone, somewhere, for 15 minutes or more. Ok, well, that's pretty vague. So now what? Well, you need to go get tested, according to the app. So you go online, find a testing center near you, get screened by a doctor over the phone, book an appointment and go get tested. Then the waiting game starts. All the while you are encouraged to self isolate. But wait, who do you inform? Your employer? Anyone you have been near in the last two weeks? And do they self isolate? Well, at the testing center they tell you that you will be contacted if you're positive, or your results will be online if negative. So you check … and check, and even if the results are returned quickly it can seem like an eternity. And in Ontario the web site takes a LOT of patience, as it asks you to type a lot of data to prove you are who you say you are, and with one mistake you're typing it all again. And anyone you told is now themselves concerned. And remember, just because you were near someone for more than 15 minutes who has since tested positive, it doesn't mean you are infected. Let's not forget that if we are being smart we are wearing masks, washing hands and social distancing, something the app has no way of knowing.

And finally the results come back: negative, phew. Now what? Are you past the possible incubation period? Remember, you have only a vague reference in time. And as infection rates rise, especially for people in certain roles, this app is going to trigger frequently.

So all in all, I have to say the app is a good idea in principle, but given its current lack of refined data as to when you were near the infected person, its usefulness is somewhat limited. And I wonder how many people are going to get needlessly tested, and be put under added stress unnecessarily awaiting test results? And once this app alerts once, on the second alert will you go get tested again? Or will you give up and uninstall it?

So as you may have already read between the lines, yes this happened in my household.

October 29, 2020 Posted by | Uncategorized | Leave a comment