How to set up kdiff3 as a Git mergetool in WSL

I have been developing in Windows WSL recently and ran into a question: when a merge conflict occurs, how do I use a good GUI mergetool?

I really like kdiff3 on Windows, so here is how you can set it up as the default mergetool in Git.

  1. Download and install the latest version of kdiff3 for Windows from here: http://kdiff3.sourceforge.net/
  2. Add kdiff3 to your Windows PATH so that WSL can easily find it:
    1. Click the Start button, search for “env”, and open “Edit the system environment variables”
    2. Click the “Environment Variables” button
    3. Select “Path” and click the “Edit” button
    4. Click “New” and add the path where KDiff3 is installed under Program Files

Setting up default merge tool in Git

Open a WSL terminal and type the following commands to set kdiff3 as the default merge tool.
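The commands themselves are a sketch, assuming KDiff3 is installed in the default location under Program Files (adjust the path if yours differs):

```shell
# Tell Git to use kdiff3 as the merge tool
git config --global merge.tool kdiff3

# Point Git at the Windows binary via the WSL mount,
# in case kdiff3 is not found on the PATH
git config --global mergetool.kdiff3.path "/mnt/c/Program Files/KDiff3/kdiff3.exe"
```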

That’s it! Now when you run the git mergetool command, kdiff3 will open as your default merge tool.

Using Mosh with Mac when installed with homebrew

When you install mosh with Homebrew and then try to connect to your computer, you will get the error message “command not found: mosh-server”. This happens because, when you SSH into the computer, the Homebrew install location of the mosh-server binary is not on the PATH.

To solve this issue, you just need to pass the --server parameter with the path to the mosh-server binary, which in our case is /usr/local/bin/mosh-server.

So, in summary, use the following command to connect to your Mac via mosh:

mosh --server='/usr/local/bin/mosh-server' <username>@<address_of_remote_server>

Example:

mosh --server='/usr/local/bin/mosh-server' mohammed@192.168.1.20

How to make better decisions

Reading this post will not instantly make you a good decision-maker, but I will introduce you to a technique that, if you choose to adopt it, will make you a good decision-maker after some years of using it.

The technique is to write things down whenever you are making a decision:

  1. Write down the decision that you are making
  2. Write down the expected outcome of the decision
  3. Add a heading 6-month review
  4. Add a heading 12-month review
  5. Create 6-month and 12-month reminders and review the decision

If you want to go full throttle, also add a heading 10-year review.

Then set a reminder. I typically use Evernote for this: I have a notebook dedicated to it, I create a note in that notebook for each big decision I am making, and I set reminders at 6 months and at 12 months to write down the actual outcome, along with what I learned from it.

By doing this over the years, you will have a great record of your right and wrong decisions. When you go over them, our incredible brain's ability to seek out patterns will let you see which types of decisions you are good at, where you fall short, and the pitfalls you tend to fall into, and you will be able to greatly improve your decision-making ability.

Let me know in the comments your ideas and suggestions regarding the above approach.

ssh-copy-id: The easiest way to copy ssh keys to another machine

I wanted to copy SSH keys to a server so that I could log in without entering a password. To do that, I would typically SSH into the server and copy my laptop's public key into the authorized_keys file on the remote server.

Then I came across a better way and realized I had been doing it wrong for a very long time: there is a much simpler way to do it, using the ssh-copy-id command.

Just run the following command in the terminal.
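In general form (substitute your own username and server address; the example values below are placeholders):

```shell
ssh-copy-id <username>@<address_of_remote_server>

# e.g.
ssh-copy-id mohammed@192.168.1.20
```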

It will prompt you to enter the password, and after that it will copy your computer's public key into the authorized_keys file on the remote machine. You will then be able to SSH in without entering the password.

That’s it! Let me know if you have any questions or comments.

Disable password prompt when running sudo via SSH

Recently I came across a problem. I had created a deploy script that would SSH into the server and run a bunch of commands, some of which required sudo, but running commands with sudo presented a password prompt, which broke the script.

I solved this by updating the sudoers file on Ubuntu. If you ever face this problem, all you have to do is update the file /etc/sudoers on your server (ideally with sudo visudo, which checks the syntax before saving).

Add this line to the end of the file
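A typical entry, assuming the deploy script logs in as a (hypothetical) user named deploy, looks like this:

```
# Allow the deploy user to run any command via sudo without a password
deploy ALL=(ALL) NOPASSWD:ALL
```

For tighter security, you can restrict this to just the commands the script needs, e.g. `deploy ALL=(ALL) NOPASSWD:/usr/bin/systemctl`.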

That’s it! Feel free to comment if you have any questions or suggestions.

VNC into the Tinker Board with x11vnc

I wanted to VNC into my Tinker Board using the default Mac VNC client. I tried different packages: RealVNC, but its license covers only the Raspberry Pi; then TightVNC, but it did not work with the default Mac client; and getting TigerVNC up was not a smooth process either, as it kept giving errors related to fonts.

Then I finally came across x11vnc, and it worked like a charm. To install it, simply run the following command on your Tinker Board.
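On the Tinker Board's Debian-based OS, x11vnc is available through apt:

```shell
sudo apt-get update
sudo apt-get install -y x11vnc
```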

Once installed, you can start the VNC server using the following command.
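A minimal invocation, assuming your desktop session runs on X display :0:

```shell
# Attach x11vnc to the current X session on display :0
x11vnc -display :0
```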

If you want to run it with a password, first set one using the following command.
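x11vnc saves a password with its -storepasswd option (by default to ~/.vnc/passwd):

```shell
# Prompts for a password and stores it for later use
x11vnc -storepasswd
```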

Now you can run x11vnc with the password using the following command.
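The -usepw flag tells x11vnc to require the password saved by -storepasswd:

```shell
# Serve display :0, requiring the stored password
x11vnc -display :0 -usepw
```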

Now, on the Mac, to connect to the Tinker Board, type the following in the terminal.
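macOS's built-in Screen Sharing client can be launched from the terminal with a vnc:// URL; substitute your Tinker Board's address for the placeholder (5900 is the default VNC port):

```shell
open vnc://<address_of_tinkerboard>:5900
```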

It will launch the VNC client and prompt you for the password, and you'll be in.

That’s it! Feel free to comment if you have any questions or suggestions.

Running Jenkins behind NGINX Proxy

I have a Jenkins server with a private IP address and an NGINX server with a public IP. I wanted to point the Jenkins domain to the NGINX server and have NGINX forward requests to the Jenkins server.

I got this working with the NGINX proxy_pass directive; here is the NGINX configuration I used to get it working.

Assume the domain used for Jenkins server is jenkins.mydomain.com and the IP address of the Jenkins server is 192.168.1.20 running at port 8080.

On the NGINX server, create the configuration file under the sites-available folder; the exact path is /etc/nginx/sites-available/jenkins.mydomain.com
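A minimal reverse-proxy server block for the values above (domain jenkins.mydomain.com, upstream 192.168.1.20:8080) would look like this; the exact configuration from the original post is not shown, so treat this as a sketch:

```nginx
server {
    listen 80;
    server_name jenkins.mydomain.com;

    location / {
        proxy_pass http://192.168.1.20:8080;

        # Forward the original host and client details to Jenkins
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```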

Then link the jenkins.mydomain.com file into sites-enabled:

sudo ln -s /etc/nginx/sites-available/jenkins.mydomain.com /etc/nginx/sites-enabled/

Test the updated configuration

sudo nginx -t

Restart the nginx server

sudo systemctl restart nginx

That’s it! Let me know if you have any questions or comments.

Running your apps on your own server – This site is running on my own server

I have two software projects and a bunch of websites and databases running on the internet. They cost me around $1,000/mo. in server costs to run, and I was sick of the slow performance and high cost of computers running in the cloud.

I have been inspired by Jeff Atwood's blog posts (particularly this and this) for a long time: first when he blogged about how running your own server is much cheaper in the long run, and then his recent post on running a mini-PC server, where the mini-PC's performance handily beat cloud computing that would cost more than $5,700 to run over a period of 3 years. To be clear, nothing beats the flexibility of cloud computing: you can spin up a server in a matter of seconds, and it is great for testing out ideas and running an application for a short time, which I have been doing. But if you want to run something long term, you are better off running your own server.

I had also been reading a lot about cheap 1U servers and a-few-years-old Xeon CPUs hitting the used market at ludicrously low prices, so I finally decided to buy a used server and host all my applications myself.

Hosting the server in your home is not a particularly great idea; since data centres have become quite inexpensive, it is best to put the server in one, so I put mine in the Nuday Networks data centre. They also offer an IPMI connection to my server over VPN, an amazing service that lets you restart/boot/manage the server remotely, and it is quite cheap at just $59 CAD/mo. (plus a $99 one-time setup fee).

So I went looking for used servers and found a great one: a Supermicro 1U server with dual 6-core Xeon CPUs, for a total of 12 cores and 24 threads, and 64 GB of RAM. I installed a VMware hypervisor on it and created the virtual machines that I require.

I will be writing more about how I configured the server and setup the firewall and everything.

My server is currently running 6 VMs:

  1. MongoDB server
  2. MariaDB server
  3. Nginx server
  4. Dead Simple Chat Server
  5. Dead Simple Screen Sharing Server
  6. Firewall

There is an internal network through which all the VMs are connected to each other, sort of like the VPC you get in AWS, and the Firewall VM provides a VPN, so I can connect via VPN and manage the services.

This site is currently running on the Nginx server along with a few other sites, and the performance is really great compared to the other hosting options.

So far I am very happy with my setup: it costs a fraction of what the same services cost to run in Google Cloud and AWS, and it is approximately 10 times more powerful.

If you want services running for a long time, then I think hosting your own servers is the best way to go.

That’s it! Feel free to comment if you have any questions or suggestions.

Maximizing productivity when working on multiple software development projects

Software development is a creative pursuit, and the creative mind needs certain cues to get in the zone and kickstart the process.

I am a web and mobile app developer who works on multiple projects during the day. To enable this, I have created a different zone for each project, where I can get in the zone and work on my craft. Working on multiple projects requires a certain amount of context, and switching context is taxing on the brain.

So, in order to work on multiple projects during the day, I work on one project at a time, with a dedicated time slot and a separate workspace for each project.

A different workspace means a different desk, chair and computer for each of my projects, ideally at a different location. Currently I am working on 3 projects, so I have 3 computers and 3 desks, each dedicated to a separate project.

Now, some of you might think this is wasteful, but trust me, it is not. Dedicating a workspace to a project is some of the best money I have ever spent, because after I am done working on one project I do not have to close the editor, the tabs, etc. I can just leave things where they are, come in the next day, and continue from where I left off.

If you look at the desks of creative geniuses, e.g. Einstein or Steve Jobs, they are messy. The programs open on your computer are akin to the things on your desk: the items you need for your project, kept easily accessible so that you don't have to break your flow. You can quickly reach the programs and files, keep working productively, and seamlessly resume where you left off.

If you close all the windows on a computer to switch to a different project and then come back to the first project, you have to re-think and re-load all the variables into your brain again. Whereas if they are already open, you still have to do some mental loading of the task, but it is far less compared to re-opening everything, e.g. the editor and browser tabs.

Steve Jobs Desk at his home office
Einstein’s Last Desk

With computers being more and more affordable it is an incredible luxury afforded to us by the modern times that I can dedicate an entire computer for one project and it has been a great boon for my productivity as I do not have to load things into my brain, when I sit on the desk for a project I just get going.

So if you are a programmer that works on a single project at a time then great for you! But if your profession demands you to work on multiple projects then trust me dedicating a separate workspace for each of the projects would be a great investment.

Conclusion

So, in summary, based on my experience the best way to work on multiple projects is to have a dedicated workspace for each project, so that you can get in the zone with minimal effort.

If you have something to add, please share your thoughts in the comments.

DeadSimpleScreenSharing 2: Open source browser based self-hosted screen sharing

This blog post is about a project I have built called DeadSimpleScreenSharing 2. It is the next version of DeadSimpleScreenSharing, and it is much better and faster than the previous version.

It offers audio conferencing out of the box and supports sharing your screen with any number of users just by sharing a URL, with very high quality and speed.

I am also offering a self-hosted version of the application that you can run on your own server; it is white label, so you can rebrand it with your organisation's name.

So, here is how you can use this super simple service:

Step 1: Go to http://deadsimplescreensharing.com and click the “Host a Meeting” button

Step 2: It will take you to the Chrome extension page, where you'll have to install the extension

Step 3: After installing the extension, click the extension icon; a window will appear, and in that window click the “Host a Meeting” button

Step 4: Done! Your screen is being shared, you can share the URL with others so that they can join your session.