Deploying an httpd application on Kubernetes with PVCs, PVs, a NodePort Service, and an NFS dynamic provisioner as centralized storage.

The Job DSL plugin attempts to solve this problem by allowing jobs to be defined in a programmatic form in a human-readable file. Writing such a file is feasible without being a Jenkins expert as the configuration from the web UI translates intuitively into code.
https://plugins.jenkins.io/job-dsl/
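As a rough illustration of the idea (the job name, repo URL, and build step below are hypothetical, not from the original write-up), a Job DSL seed script can be written to a file and fed to a seed job:

cat > seed.groovy <<'EOF'
// minimal Job DSL script: one freestyle job that polls a Git repo
job('example-build') {
    scm {
        git('https://github.com/example/repo.git', 'master')
    }
    triggers {
        scm('H/5 * * * *')   // poll SCM roughly every 5 minutes
    }
    steps {
        shell('make build')  // placeholder build step
    }
}
EOF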

Prerequisites:-

A pre-installed K8s cluster (e.g. minikube). By default, minikube has no internal NFS dynamic provisioner available behind a StorageClass, so PVCs cannot be bound to NFS-backed PVs dynamically; that is why we create an NFS-client dynamic provisioner…
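A minimal sketch of the consuming side, assuming the nfs-client provisioner is already running and exposes a StorageClass named nfs-client (all names and sizes here are illustrative):

cat <<'EOF' | kubectl apply -f -
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: httpd-pvc
spec:
  storageClassName: nfs-client      # served by the NFS-client dynamic provisioner
  accessModes: [ReadWriteMany]
  resources:
    requests:
      storage: 1Gi
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: httpd-deploy
spec:
  replicas: 1
  selector:
    matchLabels: {app: httpd}
  template:
    metadata:
      labels: {app: httpd}
    spec:
      containers:
      - name: httpd
        image: httpd
        volumeMounts:
        - name: web-root
          mountPath: /usr/local/apache2/htdocs   # httpd document root
      volumes:
      - name: web-root
        persistentVolumeClaim:
          claimName: httpd-pvc
---
apiVersion: v1
kind: Service
metadata:
  name: httpd-svc
spec:
  type: NodePort               # expose httpd on every node's IP
  selector: {app: httpd}
  ports:
  - port: 80
    targetPort: 80
EOF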


Iterate from image annotation to accurate neural networks 10x faster

Supervise.ly is a powerful platform for computer vision development, where individual researchers and large teams can annotate and experiment with datasets and neural networks.

Tasks to be created:-

Create a project designed to solve a real use case, either by applying transfer learning to an existing model (e.g. Mask-RCNN, VGG16) or by building a new model (Mask-RCNN, GANs, RNNs, etc.) to solve a real-world or entirely new problem.

Necessary requirements:-
1. Build your own custom dataset using Supervisely
2. Either create a new model or use an existing model via transfer learning
3. Launch the training on the AWS cloud

Prerequisites:-

  1. Create a free account…

https://aws.amazon.com/eks/

Amazon Elastic Kubernetes Service (Amazon EKS) is a fully managed Kubernetes service. EKS runs upstream Kubernetes and is certified Kubernetes-conformant, so you can leverage all the benefits of open-source tooling from the community. You can also easily migrate any standard Kubernetes application to EKS without needing to refactor your code. EKS currently supports Kubernetes v1.14, 1.15, and 1.16 (default).

Prerequisite:- A workstation is required with the AWS CLI v2, eksctl, and kubectl preconfigured, plus an IAM user with enough policies to create an AWS EKS cluster and an AWS EFS file system in the same VPC. …
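For orientation, the cluster itself can be created from a small eksctl config file; the cluster name, region, and node sizing below are assumptions, not part of the original write-up:

cat > cluster.yaml <<'EOF'
apiVersion: eksctl.io/v1alpha5
kind: ClusterConfig
metadata:
  name: demo-cluster
  region: ap-south-1
nodeGroups:
  - name: ng-1
    instanceType: t3.medium
    desiredCapacity: 2
EOF
eksctl create cluster -f cluster.yaml   # also updates kubeconfig for kubectl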


(jobs building on containers == Dynamic Worker Nodes).

Tasks to be created:-

1. Create a container image that has a Linux distribution and the basic configuration required to run a cloud worker node for Jenkins (e.g. here we require kubectl to be configured inside that node); a Dockerfile sketch follows this list.
2. When we launch a job, it should automatically start on a cloud worker node selected by the labels provided, giving a dynamic approach to running jobs.
3. Create a job chain of job1 & job2 using the Build Pipeline plugin in Jenkins.
4. Job1: Pull the GitHub repo automatically when a developer pushes to GitHub (using local hooks and webhooks) and perform the following operations:
4.1…
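A rough sketch of the worker-node image from task 1 (the base image, package names, and kubectl version are assumptions; the Jenkins agent wiring itself is omitted for brevity):

cat > Dockerfile <<'EOF'
FROM centos:7
# Java and git are needed for the Jenkins agent and SCM checkouts
RUN yum install -y java-11-openjdk git && yum clean all
# kubectl so that jobs running on this node can drive the cluster
RUN curl -Lo /usr/bin/kubectl \
      https://storage.googleapis.com/kubernetes-release/release/v1.16.0/bin/linux/amd64/kubectl \
 && chmod +x /usr/bin/kubectl
# a kubeconfig is expected to be mounted or copied to /root/.kube/config
EOF
docker build -t jenkins-k8s-worker:v1 .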


To create resources such as Pods, Deployments, PVCs, and Services on top of K8s.

Tasks to be created:-

1. Create a container image that has Jenkins installed, using a Dockerfile, or use a Jenkins server on RHEL 8/7.
2. When we launch this image, it should automatically start the Jenkins service in the container.
3. Create a job chain of job1, job2, job3 and job4 using the Build Pipeline plugin in Jenkins.
4. Job1: Pull the GitHub repo automatically when a developer pushes to GitHub.
5. Job2 (see the sketch after this list):
1. Create a persistent volume claim.
2. Create a service for the application.
3. Create a deployment for the application.
6. Job3: Test whether the app is working or not.
7…
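Concretely, Job2's shell build step could reduce to three kubectl applies plus a readiness check; the manifest file names and deployment name below are assumptions:

kubectl apply -f pvc.yml          # 1. create the persistent volume claim
kubectl apply -f service.yml      # 2. create the service for the application
kubectl apply -f deployment.yml   # 3. create the deployment
kubectl rollout status deployment/myapp-deploy   # block until pods are ready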


end-to-end automation

Creating AWS infrastructure (CloudFront + S3 + EC2 instances) using the Terraform tool with HCL (HashiCorp Configuration Language) scripts, while the Ansible engine is used for infrastructure configuration management.

Pre-requisites:- Preconfigured AWS CLI, Ansible engine, and Terraform CLI, plus an IAM user with administrative privileges.

[root@server terraform]# aws configure
AWS Access Key ID [********************]:
AWS Secret Access Key [********************]:
Default region name [ap-south-1]:
Default output format [None]:
[root@server terraform]# ansible --version
ansible 2.8.3
config file = /root/terraform/ansible.cfg
configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/lib/python3.6/site-packages/ansible
executable location = /usr/local/bin/ansible
python version = 3.6.8 (default, Jan 11 2019, 02:17:16) [GCC 8.2.1 20180905]
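One way to picture the handoff between the two tools: Terraform creates the instance, then a local-exec provisioner invokes an Ansible playbook against it. The AMI ID, key name, and playbook name below are placeholders, and the CloudFront and S3 resources are left out for brevity:

cat > main.tf <<'EOF'
provider "aws" {
  region = "ap-south-1"
}

resource "aws_instance" "web" {
  ami           = "ami-0123456789abcdef0"   # placeholder AMI ID
  instance_type = "t2.micro"
  key_name      = "mykey"                   # placeholder key pair

  # hand the fresh host over to Ansible for configuration management
  provisioner "local-exec" {
    command = "ansible-playbook -i '${self.public_ip},' webserver.yml"
  }
}
EOF
terraform init && terraform apply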

Build Pipeline View

Task description

  1. Create a container image that has Python 3 and Keras or NumPy installed, using a Dockerfile (a sketch follows this list).
  2. When we launch this image, it should automatically start training the model in the container.
  3. Create a job chain of job1, job2, job3, job4 and job5 using the Build Pipeline plugin in Jenkins.
  4. Job1: Pull the GitHub repo automatically when a developer pushes to GitHub.
  5. Job2: By looking at the code or program file, Jenkins should automatically start a container from the image that has the respective machine-learning software and interpreter installed, deploy the code into it, and start training (e.g. …
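The image from step 1 might look like the following sketch (the base image, library set, and the train.py entry point are assumptions):

cat > Dockerfile <<'EOF'
FROM python:3.6
# libraries the training code is assumed to need
RUN pip install numpy keras tensorflow
COPY train.py /train.py
# start training as soon as the container launches
CMD ["python", "/train.py"]
EOF
docker build -t ml-trainer:v1 .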


Using Keras and VGG16 (a model pre-trained on the ImageNet dataset) for transfer learning.

Transfer Learning

Pre-requisites-

  1. Install Anaconda Python Distribution.
  2. Install the required Python libraries # pip install -r requirements.txt
  3. From a command prompt, run > jupyter notebook

Step 1

Clone the repo:
# git clone https://github.com/A4ANK/face_recognition_Transfer_Learning.git
Inside the repo directory, create two subdirectories:
# mkdir train
# mkdir test
Similarly, create subdirectories inside the train and test directories for each subcategory you want to predict. Now collect the dataset using the face_extractor.ipynb notebook.

Step 2

Load the pre-trained VGG16 model inside Transfer Learning.ipynb, then use transfer learning to train it on the new, smaller dataset created with the face_extractor.ipynb notebook; a minimal sketch follows.
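The core of that notebook presumably boils down to freezing VGG16's convolutional base and training a small new classification head; a minimal sketch, with the head's layer sizes, class count, and directory names as assumptions:

python - <<'EOF'
# freeze the ImageNet-trained base, train only the new head
from keras.applications import VGG16
from keras.models import Model
from keras.layers import Dense, Flatten
from keras.preprocessing.image import ImageDataGenerator

base = VGG16(weights='imagenet', include_top=False, input_shape=(224, 224, 3))
for layer in base.layers:
    layer.trainable = False          # keep the pre-trained features intact

x = Flatten()(base.output)
x = Dense(256, activation='relu')(x)
out = Dense(2, activation='softmax')(x)   # 2 classes is an assumption
model = Model(inputs=base.input, outputs=out)
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])

gen = ImageDataGenerator(rescale=1. / 255)
train = gen.flow_from_directory('train', target_size=(224, 224), batch_size=16)
test = gen.flow_from_directory('test', target_size=(224, 224), batch_size=16)
model.fit_generator(train, epochs=5, validation_data=test)
model.save('face_recog_vgg16.h5')
EOF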

Step 3

Run the Real Time Face recognizer.ipynb notebook to perform real-time face recognition.

Github repo:- https://github.com/A4ANK/face_recognition_Transfer_Learning


Integrating Jenkins with local git hooks and deploying the source code on Docker containers using PollSCM triggers.

Three jobs are needed to simulate this project.

  1. Job 1

Deploys a testing environment on top of Docker, using a git post-commit hook, whenever commits are made on a feature branch (i.e. any branch other than master, the main branch); the job is scheduled using PollSCM.

git hooks => post-commit script
vi .git/hooks/post-commit

#!/bin/bash
echo "Post-commit tasks are started"
git fetch
git push
echo "git push is done to the current remote branch"
# echo "Remote build trigger using the Jenkins URL"
# curl --user "username:password" http://<jenkins-url>/job/job3/build?token=TOKEN

Job1


ANKUR DHAKAR

I'm a computer science undergraduate and my primary areas of work are Linux, cloud computing, DevOps culture, and various open-source tools and technologies.
