ERNI Technology Post No. 60: ASP.NET Core as an easy cross-platform solution

The .NET Core framework has been around for some time now and claims to be a cross-platform solution. Nevertheless, customers still feel a little uncomfortable when somebody proposes a .NET Core solution for their non-Windows environment. Nobody doubts that .NET Core can be deployed and run in the Azure cloud, but can it be just as easy in a non-Windows environment outside Azure? In this article, we try to help answer this question with an example from a real-world project.

A couple of months ago, we were asked to estimate how much effort it would take to provide a web application solution that would run in a customer’s private cloud on Linux. The solution is a typical three-layer client-server application with an Angular frontend, a REST API backend and a relational database for storage. As the target technology, we chose the ASP.NET Core framework and the Docker container platform. Unsurprisingly, we chose Visual Studio 2017 as our development environment.

Solution overview

The prerequisites for the solution were as follows:

  • ASP.NET Core 2.0 REST API as the backend

  • Angular 4 as the frontend

  • Entity Framework Core (EF Core) as the ORM

  • MariaDB 5.5.42 as the RDBMS

  • Docker as the container platform for the whole solution

The plan was to create a Docker image containing both the REST service and the Angular frontend application. The following figure shows the deployment diagram of our application in the customer’s private cloud.

REST API service – proof of concept

Our first step was to find out whether there were any fundamental problems with running an ASP.NET Core solution in the customer’s private cloud on Linux. To answer this question, we decided to deploy a simple REST API service to that cloud.

We started by downloading the Docker install package for Windows from the official Docker website (https://www.docker.com/docker-windows – the downloaded package is a native Windows application) and installing it on our development machine. We performed the following basic configuration through Docker’s graphical user interface:

  • switched Docker to Linux container mode (as Linux is the target platform)

  • selected the local drives that should be available to our containers

For a more detailed overview of what Docker is and how to use it, please see our previous Technology Post on this topic: https://www.erni-consultants.com/en/blog/technology/erni-technology-post-no-52-using-docker-continuous-delivery

We opened Visual Studio and created a new solution, adding an ASP.NET Core Web Application project to it based on the built-in Web API template. We also checked the Enable Docker support option (Linux containers) when creating the project from the template. The resulting solution contained two projects: our RESTful HTTP backend service project and a docker-compose project. We set the docker-compose project as the startup project and rebuilt the solution to check for any problems.
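
For reference, the Web API template scaffolds a minimal values controller roughly along these lines (reproduced from memory, so details may differ slightly between template versions); this is the controller behind the /api/values endpoint mentioned further below:

```csharp
using System.Collections.Generic;
using Microsoft.AspNetCore.Mvc;

namespace WebApplication1.Controllers
{
    // Default controller generated by the ASP.NET Core 2.0 Web API template.
    [Route("api/[controller]")]
    public class ValuesController : Controller
    {
        // GET api/values
        [HttpGet]
        public IEnumerable<string> Get()
        {
            return new string[] { "value1", "value2" };
        }

        // GET api/values/5
        [HttpGet("{id}")]
        public string Get(int id)
        {
            return "value";
        }
    }
}
```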

Running the Docker container can be done directly from Visual Studio by starting the docker-compose project. This builds the image, deploys it into the local Docker repository, and then creates and starts a container instance from it. We opened PowerShell and ran the docker images command, which lists the Docker images in the repository (excluding intermediate images). The docker ps command lists the running containers.

As you can see in the above figure, there are two images in the Docker repository on our development machine. The first image contains our webapplication1 project and the second is the official ASP.NET Core 2.0 image for running compiled ASP.NET Core applications. The second list in the figure (under the docker ps command) shows the running containers with some additional information. In our case, only one container is running, representing our REST service application (created from the image named webapplication1). The essential piece of information here is port 32772, on which our application is listening. While the docker-compose project is running in Visual Studio, you can reach the application at http://localhost:32772/api/values.

The next step was to deploy this REST service into the customer’s cloud. For this, we used the default implementation of the REST service provided by the Visual Studio template, which we considered sufficient for a quick test of whether the whole concept could work. We pushed the created Docker image to Docker Hub (https://hub.docker.com/) using the docker tag and docker push commands. In the figure below, you can see that we tagged our webapplication1 image as kuto/marketplace:version1.0 and then pushed it to the Docker Hub repository named kuto/marketplace.

Then, in the customer’s private cloud, we selected the option to deploy a Docker image from a public Docker Hub repository. The test was successful without any complications – we were able to access the REST API service hosted in the cloud.

Integrating MariaDB into the solution

The final step was the integration of a relational database engine into our solution. The only relational database available within the customer’s current infrastructure was MariaDB, which is essentially a MySQL-compatible database.

We created a new .NET Core Class Library project within our solution to represent our data model and added the Pomelo Entity Framework Core dependency to it through NuGet (https://www.nuget.org/packages/Pomelo.EntityFrameworkCore.MySql). We decided to use the code-first approach for our database model; the detailed implementation and configuration of the EF Core code-first setup is beyond the scope of this article. We created a Person entity with ID, Name and Surname properties. We rebuilt the solution and ran the REST service project locally on our development machine.
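
Still, a minimal sketch of the model may help orientation. The Person properties come from the project described above; the namespace, the context name and the rest of the code are assumptions made only for this sketch:

```csharp
using Microsoft.EntityFrameworkCore;

namespace DataModel
{
    // Person entity with the three properties mentioned above.
    public class Person
    {
        public int ID { get; set; }
        public string Name { get; set; }
        public string Surname { get; set; }
    }

    // EF Core context exposing the Person entity (name chosen for this sketch).
    public class PersonContext : DbContext
    {
        public PersonContext(DbContextOptions<PersonContext> options)
            : base(options)
        {
        }

        public DbSet<Person> Persons { get; set; }
    }
}
```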

After a successful test in our local environment, we updated the Docker image in the public repository on Docker Hub using the docker tag and docker push commands, but this time we tagged our image as kuto/marketplace:version2.0 (version 2.0 of our REST service). We then tested our updated REST service in the cloud: we checked that the database schema was created at application startup and added a sample person to the database by sending a POST request to the updated REST API. The test was successful.
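
For illustration, a hypothetical controller handling such a POST request could look roughly like this; the controller name and route are our own choices, reusing the Person and PersonContext types from the sketch above:

```csharp
using DataModel; // Person and PersonContext from the sketch above
using Microsoft.AspNetCore.Mvc;

namespace WebApplication1.Controllers
{
    // Hypothetical controller accepting the POST request that inserts a sample person.
    [Route("api/[controller]")]
    public class PersonsController : Controller
    {
        private readonly PersonContext _context;

        public PersonsController(PersonContext context)
        {
            _context = context;
        }

        // POST api/persons
        [HttpPost]
        public IActionResult Post([FromBody] Person person)
        {
            _context.Persons.Add(person);
            _context.SaveChanges();
            return CreatedAtAction(nameof(Get), new { id = person.ID }, person);
        }

        // GET api/persons/5
        [HttpGet("{id}")]
        public IActionResult Get(int id)
        {
            var person = _context.Persons.Find(id);
            if (person == null)
            {
                return NotFound();
            }
            return Ok(person);
        }
    }
}
```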

Conclusion

In this article, we have tried to share our experience of how easy it is to deploy an ASP.NET Core application into a non-Windows environment outside Azure. The application was a simple REST service with MariaDB support, packaged as a Docker image. The purpose was neither to describe in detail how to build such an application nor to explain how to configure Docker or a MariaDB connection, but rather to share our point of view and experience on the topic.

The main issue we struggled with was setting up an EF Core code-first approach with a MariaDB database. We did not experience any issues regarding data provider compatibility with MariaDB; we only lacked the know-how to set up the data access layer correctly.
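
For readers facing the same question, a minimal sketch of such a data access setup with the Pomelo provider is shown below. The connection string is a placeholder, and the use of EnsureCreated to create the schema at startup is only one simple option, not necessarily the one used in the project:

```csharp
using DataModel; // PersonContext from the sketch above
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;

namespace WebApplication1
{
    public class Startup
    {
        public void ConfigureServices(IServiceCollection services)
        {
            // Register the EF Core context with the Pomelo MySQL/MariaDB provider.
            // The connection string is a placeholder for the customer's MariaDB instance.
            services.AddDbContext<PersonContext>(options =>
                options.UseMySql("Server=mariadb;Database=marketplace;Uid=app;Pwd=secret;"));

            services.AddMvc();
        }

        public void Configure(IApplicationBuilder app, IHostingEnvironment env)
        {
            // Create the database schema at application startup.
            // EnsureCreated is the simplest approach; EF Core migrations are the
            // alternative when the schema needs to evolve over time.
            using (var scope = app.ApplicationServices
                .GetRequiredService<IServiceScopeFactory>()
                .CreateScope())
            {
                scope.ServiceProvider.GetRequiredService<PersonContext>()
                    .Database.EnsureCreated();
            }

            app.UseMvc();
        }
    }
}
```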

To sum up, the whole deployment process was quite smooth. We had no issues deploying our ASP.NET Core 2.0 REST service to the Linux environment (hosted in a Docker container). The application was immediately available to the public.

posted on 16.10.2017
by: Tomáš Kučečka and Jakub Šturc