Installing a Self-Signed Cert on Your Local Machine

As browser security has tightened in 2021, it's gotten a lot harder to install a self-signed cert on your local dev server and get that nice pretty dark-gray lock (I kind of miss the green). It's very important, especially if you are working with web workers and JavaScript. Browsers are continually tightening down security and no longer accept TLS 1.0 or 1.1, so keep those certs, virtual host configurations, and web server SSL settings up to date! This is something you have to do every time you work on a new domain, and you won't have a cool tool like Let's Encrypt and Certbot to use, because their validation does a round trip to the server, which means it has to be publicly exposed and available online, and that is something you should never do with your local development server. Since it took all day to figure out, I thought I'd write an article on it.

Privacy error ("Not Secure") shown for the local dev website before the self-signed cert is trusted

The process is pretty straightforward.

  1. Generate a key and certificate with OpenSSL.
  2. Specify the key and certificate in your web server's configuration file.
  3. Restart the web server.
  4. Export the certificate from the browser.
  5. Import that certificate back into the browser as a trusted root certificate.
  6. Restart the browser.

I set this up on my Windows machine with Git Bash and OpenSSL, running the Apache web server and using Chrome as my browser, but the steps are the same on a Linux machine. The web configuration will change if you are using a different web server, or depending on where you put your website or generated your cert/key, but every web server will have a way to specify a key and certificate for HTTPS.

One of the major changes from before is that I had to use a configuration file for OpenSSL. I call it minimal.cnf and it looks like this:

prompt = no
distinguished_name = req_dn
x509_extensions = x509_ext

[ req_dn ]

commonName = www.websitedomain.local

[ x509_ext ]

subjectAltName = @alt_names

[ alt_names ]

DNS.1 = www.websitedomain.local
DNS.2 = websitedomain.local

minimal.cnf file (change commonName, DNS.1, and DNS.2 to your local domain names; I believe you can also specify an IP as IP.1 under [ alt_names ] if you want)

The openssl command is as follows. Once again, change the names to something appropriate for your website; these are the files that are referenced in your virtual host file. I set it to be good for less than three years because I believe Chrome enforces some sort of maximum validity for security reasons.

openssl req -newkey rsa:4096 -x509 -sha256 -days 1000 -nodes -out www.websitedomain.local.crt -keyout www.websitedomain.local.key -config minimal.cnf
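Before wiring the cert into Apache, it's worth sanity-checking that the SAN entries actually made it in. Here's a quick, self-contained way to test the whole recipe with throwaway files in /tmp (same style of config as above, just written inline):

```shell
# Write the minimal config to a throwaway location...
cat > /tmp/minimal-test.cnf <<'EOF'
prompt = no
distinguished_name = req_dn
x509_extensions = x509_ext

[ req_dn ]
commonName = www.websitedomain.local

[ x509_ext ]
subjectAltName = @alt_names

[ alt_names ]
DNS.1 = www.websitedomain.local
DNS.2 = websitedomain.local
EOF

# ...generate a throwaway key/cert pair from it...
openssl req -newkey rsa:2048 -x509 -sha256 -days 1000 -nodes \
  -out /tmp/test.crt -keyout /tmp/test.key -config /tmp/minimal-test.cnf

# ...and confirm both DNS names show up under Subject Alternative Name.
openssl x509 -in /tmp/test.crt -noout -text | grep -A1 'Subject Alternative Name'
```

If the grep comes back empty, the browser will never show the lock no matter how you import the cert, so it's worth catching here.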

My Apache configuration file for this virtual host in httpd-vhosts.conf looks like this:

<VirtualHost *:443>
    ServerAdmin [email protected]
    DocumentRoot "C:/websites/www.websitedomain.local/public_html"
    ServerName www.websitedomain.local
    ServerAlias websitedomain.local
    ErrorLog "logs/www.websitedomain.local-error.log"
    CustomLog "logs/www.websitedomain.local-access.log" common
    DirectoryIndex index.php index.html index.htm
    SSLEngine on
    SSLCertificateFile "C:/websites/www.websitedomain.local/ssl/www.websitedomain.local.crt"
    SSLCertificateKeyFile "C:/websites/www.websitedomain.local/ssl/www.websitedomain.local.key"
    <Directory "C:/websites/www.websitedomain.local/public_html">
        AllowOverride All
        Options Indexes FollowSymLinks Includes ExecCGI
        Require all granted
    </Directory>
</VirtualHost>

Don't forget to make sure your hosts file has an entry for your local website domain (www.websitedomain.local).
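For reference, the entries look like this (Windows path shown; on Linux it's /etc/hosts):

```
# C:\Windows\System32\drivers\etc\hosts
127.0.0.1    www.websitedomain.local
127.0.0.1    websitedomain.local
```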

[Of course, change "websitedomain" to the domain you want to set up and access the website through. The name is arbitrary, but it needs to be consistent: make sure it's the same in minimal.cnf, the hosts file, the virtual hosts file, and in the URL you use to access the website.]

Now comes the tedious part where you export the certificate through the browser and then import the slightly modified certificate as a Trusted root certificate.

And finally, don’t forget to close and reopen your browser before checking the url and it should be good for 1000 days! You can try longer, but I don’t recommend it since security requirements change all the time with browsers and web servers.

Successful Installation of Self Signed Cert for Local Dev Server

Installing Zabbix Agent 4.4.1 on Ubuntu

Install the same agent version as the one on your server (in this case Zabbix 4.4.1); you can see the available releases by browsing the release folder in your browser.

Add the package you just downloaded:

sudo dpkg -i zabbix-release_4.4-1+bionic_all.deb

sudo apt update

Install the agent

sudo apt install zabbix-agent

Configure the agent by editing its config file:

sudo vi /etc/zabbix/zabbix_agentd.conf

Edit the server IP and the hostname:

Server=IP address of Zabbix Server

Hostname=Hostname of client PC
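Filled in with a hypothetical server IP and client hostname, those two lines in /etc/zabbix/zabbix_agentd.conf would look something like:

```
Server=192.0.2.10
Hostname=web-client-01
```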

Restart and enable the agent

sudo systemctl restart zabbix-agent

sudo systemctl enable zabbix-agent

Verify the agent is up by checking the status:

systemctl status zabbix-agent

and then of course, go to your Zabbix front end, add the Host and see the results!

Troubleshooting: make sure port 10050 (that's the default, or whatever port you configured if you changed it) is open for pushing data to the Zabbix server. Use nmap because it's awesome:

nmap -p 10050 [zabbix server IP]

Keep Your Ubuntu Server Clean and Up to Date!

Just use these simple commands, in this order.

sudo apt-get update

sudo apt-get --with-new-pkgs upgrade

sudo apt-get dist-upgrade #updates linux kernel

sudo apt-get clean

sudo apt-get autoclean

sudo apt-get autoremove

For most setups this will work fine. There are other tools you can use (ucaresystem, localepurge, gtkorphan, BleachBit), but these commands cover the basics and go a long way toward keeping your server clean, maintained, and up to date.
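If you want the routine steps to run on a schedule, a rough sketch as a root crontab entry could look like this (assumptions: weekly run at 4am Sunday, and you're comfortable with what -y auto-accepts; review before relying on it):

```
# m h dom mon dow   command
0 4 * * 0   apt-get update && apt-get --with-new-pkgs -y upgrade && apt-get -y autoremove && apt-get clean
```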

Easy Amazon Wishlist 2.0 WordPress plugin goes live!

Easy Amazon Wishlist 2.0!

Ever since Amazon retired their wishlist API, it's been difficult for customers to find a convenient way to display their Wishlists (and now Idea Lists as well!). Wishlists are great for non-profits, social media stars with loyal fans, birthdays, or whatever reason you might have for providing a mechanism for someone to buy you something. We also added the capability to add an Amazon Idea List to your blog; these are ideal for recommending products on Amazon. We are going to work on a Facebook app to do something similar next, and we have great plans for the future of the plugin, but we have to prioritize, so please let us know which new feature for the Easy Amazon Wishlist plugin you would like to see worked on first!

Coming Soon
What new feature do you want worked on first?

Incomplete WordPress Install

How to manually add a WordPress Admin

Well, we ran into an interesting problem the other day: we were able to install WordPress on a Windows 8 laptop for local development, but it only partially installed and stopped at step 2.

We hit refresh on the page, which resubmitted the login details from the first step and took us to a screen saying WordPress is already installed.

There are a couple of reasons this error can come up, most likely a PHP server-side timeout. Unfortunately, the step it failed on is the one that adds the admin user to the WordPress install. So if you click the "Log in" button it takes you to the wp-login.php page, but you don't have a user to log in with, and if you look in the users table in the database you can see that it's empty. Your options are to keep futzing with the php.ini file and restarting Apache until it installs correctly (for that, just delete all the tables in your WordPress database and hit /wp-admin/install.php again), or to add the admin user manually in the database. Manually adding a user to a WordPress install involves modifying the users, usermeta, and options tables (in your install they will probably be called wp_users, wp_usermeta, and wp_options, or whatever prefix you've set in your configuration file). So in your MySQL database (or MariaDB, I suppose) you want to run the following SQL queries to add rows to the relevant tables. Change "prefix" to whatever value you specified in step 1 for the WordPress database prefix; you can also find it in your wp-config.php file under $table_prefix = 'prefix_';

INSERT INTO `prefix_users` (`user_login`, `user_pass`, `user_nicename`, `user_email`, `user_status`)
VALUES ('newadmin', MD5('pass123'), 'firstname lastname', '[email protected]', '0');

INSERT INTO `prefix_usermeta` (`umeta_id`, `user_id`, `meta_key`, `meta_value`) 
VALUES (NULL, (Select max(id) FROM prefix_users), 'prefix_capabilities', 'a:1:{s:13:"administrator";s:1:"1";}');

INSERT INTO `prefix_usermeta` (`umeta_id`, `user_id`, `meta_key`, `meta_value`) 
VALUES (NULL, (Select max(id) FROM prefix_users), 'prefix_user_level', '10');

Make sure you use your own prefix in the prefix_capabilities and prefix_user_level meta keys in the prefix_usermeta table as well (notice that the prefix appears in the meta_key values above, not just the table names). On a side note, if you have multiple WordPress installs in different directories on the same website, make sure prefix_user_roles in prefix_options has the right prefix. Let us know if you have any problems with these directions and we'll see if we can't point you in the right direction.
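To sanity-check the three inserts, a quick query like this should show the new admin with both meta rows (assuming the default wp_ prefix; swap in yours):

```sql
SELECT u.ID, u.user_login, m.meta_key, m.meta_value
FROM wp_users u
JOIN wp_usermeta m ON m.user_id = u.ID
WHERE u.user_login = 'newadmin'
  AND m.meta_key IN ('wp_capabilities', 'wp_user_level');
```

If you get two rows back, you should be able to log in at wp-login.php (WordPress will re-hash the MD5 password on first login).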

Dusting off the cobwebs

So it's been a while since I blogged anything, and so much has happened. Recently I moved from a managed server to a Linode server because I felt like I needed the root access and the responsibility of keeping up my own server. In my last job I got a lot of experience maintaining my own servers (really several servers with MySQL replication set up) and learned a lot about server security and running my own websites. So why Linode? It was recommended to me years ago by a knowledgeable friend, and it held up after some research on pricing and server stats: how many cores, what kind of CPU, how much throughput per month, and whether it will scale at a reasonable price when traffic increases. I chose Ubuntu for my OS since that's what I've gotten comfortable with; it has huge community support and has been rock solid for me so far. I installed the LAMP stack using this LAMP install guide, because what you get is very bare bones and you are really building up the server yourself. Of course, you need to lock down the server fairly quickly, because most applications start out fairly wide open when first installed. This Linode security doc was a great help in reminding me what aspects to secure first. First, create a standard login user with sudo access (sudo allows a regular user account to act as root); this is the account you will log in with and upload/download files with. Also add it to the same group as the www-data user (what the Apache process runs as). This lets Apache access the files and directories you've uploaded, and it has the handy side effect of making it easy to see which files you uploaded versus which were created by your server scripts or uploaded via Apache (which is one of the most vulnerable points of any server).

Moving servers was a real pain. I upgraded PHP to 7.2 (I really wanted the roughly 50% increase in code execution speed this is supposed to give you), MySQL to 5.7.25, and Apache to 2.4.38. Upgrading everything gives me increased security from having all the latest bug fixes and security patches, but it also called for a lot of code changes. For the same reason I upgraded from Ubuntu 18 to Ubuntu 19, which was an adventure in and of itself; this was the Ubuntu upgrade guide I used, and it went pretty smoothly. Before upgrading I also had to install sendmail (so my PHP programs could mail()). There are multiple mail servers to choose from, and I will probably use a more advanced one when I create email accounts with my domains, but sendmail is nice just to get my contact forms back up and running, and it's rock solid.

Upgrading to PHP 7.2 (and really any pre-7 code to PHP 7) primarily consists of updating the MySQL calls. Ideally I would take my custom code and move it to a shiny new framework like Laravel, which is already up to date for PHP 7, but it was quicker to just update my current code base. My SQL queries were fairly basic, nothing too fancy, so they stayed the same; you are just swapping the old mysql_* functions for their mysqli procedural versions. Mostly that means search-and-replace: "mysql_query(" becomes "mysqli_query($connection," (the caveat being that you need to make sure all your function calls have access to $connection), "mysql_error()" becomes "mysqli_error($connection)", and so on. Not easy. Check the logs to see what errors are being thrown and fix as you go until all the pages load and no errors appear in the logs.
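For the mechanical part of that search-and-replace, something like this gets you most of the way (the file name here is a hypothetical stand-in; this assumes GNU sed as shipped on Ubuntu, and you still have to review every call site to confirm $connection is in scope):

```shell
# Create a tiny stand-in for a legacy PHP file with old mysql_* calls...
printf '%s\n' '$res = mysql_query($sql);' '$err = mysql_error();' > legacy.php

# ...then rewrite the old calls to their mysqli procedural forms in place.
sed -i 's/mysql_query(/mysqli_query($connection, /g; s/mysql_error()/mysqli_error($connection)/g' legacy.php

cat legacy.php   # $res = mysqli_query($connection, $sql); etc.
```

Run it per file (or loop over your codebase with find), then diff against version control before committing anything.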

Some other big things I did: moved my project code to Bitbucket and Git instead of GitHub and SVN, upgraded my very, very old WordPress websites to the latest version (including updating the PHP code to 7), installed free SSL certs with Let's Encrypt, installed phpMyAdmin, Webmin, and Fail2Ban, put everything behind Cloudflare, and set up local development environments, which I will detail in later articles.

The 2 best things since sliced bread

Not going to talk a lot here, but I wanted to share two quick things. If you are not using Cloudflare, use it. It's a CDN, but also so much more: it can protect your site from attacks, has a handy firewall, a convenient interface for managing your DNS, some nifty analytics, and it can handle your SSL cert to make going HTTPS dead simple. One of my favorite things is the speed optimizations, where it can automatically minify your CSS and JS as well as do some very cool image optimization. Best of all, it's free for small companies and cheap for large ones. I use it on all my work sites, and when I get some time I'm switching this site to it. They've also just implemented a new geographic load balancer that I'm excited to try.
The other thing I wanted to mention is the site HTACCESS Tester. If you've ever had to deal with an .htaccess file, and who hasn't if you're doing serious web dev or SEO, you know it's a pain. It can do remarkable things with URL rewrites and 301s, BUT it can be incredibly evil as well. You probably already know about the directive you can insert into your virtual host configuration to turn on logging for rewrites (LogLevel alert rewrite:trace6), but that log is waaaay too much info, especially with a large .htaccess file. Enter the htaccess tester: now you can write a rule and check whether it gets applied the way you think it will. Seriously, it's a life saver.
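As an example of the kind of rule you'd paste into the tester, here's a hypothetical non-www to www 301 redirect (substitute your own domain):

```
RewriteEngine On
RewriteCond %{HTTP_HOST} ^websitedomain\.com$ [NC]
RewriteRule ^(.*)$ https://www.websitedomain.com/$1 [R=301,L]
```

Feed it a test URL like http://websitedomain.com/page and the tester will show you which conditions matched and what the final rewritten URL is.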

Dual Pivot Quicksort Kicks Butt!

While checking out different flavors of quicksort after taking the Coursera Algorithms class, I found out about a version of quicksort that was much faster than I had anticipated. Java 7 had recently changed its sort function to use a new algorithm called Dual Pivot Quicksort, so I thought I'd check it out. I found a Java version and ported it to JavaScript. It works well, but currently only sorts integers, and only in ascending order. Ideally I should modify it to work the same way the normal array sort does, so I'd be comparing apples to apples. I believe the V8 JavaScript engine has also moved to the new dual-pivot algorithm under the hood, but it has some additional checks and improved functionality (you can pass in a comparison function), which may be why it's consistently slower than the JavaScript version of dual pivot quicksort that I coded up. Mine is consistently 10%-50% faster than the default array.sort(). The default sort will work well enough for 90% of your sorting cases, but if you are following the 10/90 rule and have narrowed the bottleneck in your code down to sorting, then I would advise taking this code and modifying it for your needs. Check it out on JSPerf.

Technically JSPerf said it was ±10.25% faster than the default sort the last time I ran it, but it does vary quite a bit; my guess is that it varies based on how random the shuffle is. I suppose it could also be because we are comparing a lion to a cheetah: the default sort handles any object and lets you pass in a comparison function, while the dual pivot quicksort I coded up only works on integers in ascending order. But I'd still say it's worth looking into, ESPECIALLY if whatever language you are coding in hasn't rewritten its quicksort to use the new algorithm. I made it a single-variable module, and you call it with DualPivotQuicksort.sort(arr). Here's the code.

/******** Dual Pivot QuickSort ***************/

var DualPivotQuicksort = (function() {

    var dualPivotQS = {};

    dualPivotQS.sort = function(arr, fromIndex, toIndex) {
        if (fromIndex === undefined && toIndex === undefined) {
            this.sort(arr, 0, arr.length);
        } else {
            rangeCheck(arr.length, fromIndex, toIndex);
            dualPivotQuicksort(arr, fromIndex, toIndex - 1, 3);
        }
        return arr;
    };

    function rangeCheck(length, fromIndex, toIndex) {
        if (fromIndex > toIndex) {
            console.error("fromIndex(" + fromIndex + ") > toIndex(" + toIndex + ")");
        }
        if (fromIndex < 0) {
            console.error("fromIndex is negative: " + fromIndex);
        }
        if (toIndex > length) {
            console.error("toIndex(" + toIndex + ") > array length(" + length + ")");
        }
    }

    function swap(arr, i, j) {
        var temp = arr[i];
        arr[i] = arr[j];
        arr[j] = temp;
    }

    function dualPivotQuicksort(arr, left, right, div) {
        var len = right - left;

        if (len < 27) { // insertion sort for tiny array
            for (var i = left + 1; i <= right; i++) {
                for (var j = i; j > left && arr[j] < arr[j - 1]; j--) {
                    swap(arr, j, j - 1);
                }
            }
            return;
        }
        var third = Math.floor(len / div);

        // "medians"
        var m1 = left  + third;
        var m2 = right - third;

        if (m1 <= left) {
            m1 = left + 1;
        }
        if (m2 >= right) {
            m2 = right - 1;
        }
        if (arr[m1] < arr[m2]) {
            swap(arr, m1, left);
            swap(arr, m2, right);
        } else {
            swap(arr, m1, right);
            swap(arr, m2, left);
        }
        // pivots
        var pivot1 = arr[left];
        var pivot2 = arr[right];

        // pointers
        var less  = left  + 1;
        var great = right - 1;

        // sorting
        for (var k = less; k <= great; k++) {
            if (arr[k] < pivot1) {
                swap(arr, k, less++);
            } else if (arr[k] > pivot2) {
                while (k < great && arr[great] > pivot2) {
                    great--;
                }
                swap(arr, k, great--);

                if (arr[k] < pivot1) {
                    swap(arr, k, less++);
                }
            }
        }
        // swaps
        var dist = great - less;

        if (dist < 13) {
            div++;
        }
        swap(arr, less  - 1, left);
        swap(arr, great + 1, right);

        // subarrays
        dualPivotQuicksort(arr, left,      less - 2, div);
        dualPivotQuicksort(arr, great + 2, right,    div);

        // equal elements
        if (dist > len - 13 && pivot1 != pivot2) {
            for (var k = less; k <= great; k++) {
                if (arr[k] == pivot1) {
                    swap(arr, k, less++);
                } else if (arr[k] == pivot2) {
                    swap(arr, k, great--);

                    if (arr[k] == pivot1) {
                        swap(arr, k, less++);
                    }
                }
            }
        }
        // subarray
        if (pivot1 < pivot2) {
            dualPivotQuicksort(arr, less, great, div);
        }
    }

    return dualPivotQS;
})();


Javascript Modular Design

So I just completed my update of my Facebook app, My Amazon Wishlist. I converted all of my messy initialization and jQuery UI event-handling code into a nice, neat, single-variable JavaScript module. It is sooooo much prettier: cleaner, more modular (it's right there in the name!), and less likely to interfere with any other JavaScript includes. I briefly contemplated creating JavaScript objects for the lists; if I were to recode it from scratch that's how I would do it, but for now I'll leave it as is, because cloning the objects is faster and it can stay that way until I decide to implement more features. The back-end is all PHP objects, which has made it incredibly easy to update and add features, as well as re-use in my WordPress plugin. I also updated from jQuery 1.8.1 to 1.10 and had to update a few deprecated function calls that were removed in 1.9 (GO $(document).on('click', '#id', function(){});!). I also had to replace window.parent.document calls inside an iframe with jQuery postMessage calls, and they actually turned out quite nicely: you set up a listener in the parent frame, then from the iframe (which can be on another website, solving the cross-domain problem) you post the message to the parent. In my case I was sending multiple pieces of data, so I put it into an object, stringified it, and passed it to the parent. One thing I would note: in the listener, you should make it so that it only accepts messages from the website you expect to be sending them. Here's the code from the iframe:

var msgArray = {};
msgArray['newHeight'] = $('#aws-item').height().toString();
msgArray['listNo'] = listNo;
msgArray['itemNo'] = itemNo;
msgArray['iframeID'] = iframeID;
var message = JSON.stringify(msgArray);
parent.window.postMessage(message, '*');
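And for completeness, here's a sketch of the matching listener in the parent frame. The origin and element ID are hypothetical; the important part is rejecting messages from any origin you don't expect:

```javascript
// Hypothetical origin of the iframe that posts the messages.
var EXPECTED_ORIGIN = 'https://apps.example.com';

// Pure handler, so the origin check and parsing are easy to test:
// returns null for untrusted senders, otherwise the decoded payload.
function handleWishlistMessage(event) {
    if (event.origin !== EXPECTED_ORIGIN) {
        return null; // ignore messages from anywhere else
    }
    var data = JSON.parse(event.data); // the iframe sends a JSON string
    return { id: data.iframeID, height: parseInt(data.newHeight, 10) };
}

// In the parent page you'd wire it up and resize the iframe the child named:
// window.addEventListener('message', function (e) {
//     var msg = handleWishlistMessage(e);
//     if (msg) { document.getElementById(msg.id).style.height = msg.height + 'px'; }
// });
```

Keeping the handler separate from the addEventListener wiring also makes it trivial to unit test the origin check.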

And so ends my first article for my new website!  I think I'll write the next one on how I localized (L10N) my WordPress plugin.