Merge branch 'MichaelCade:main' into main

commit 771ae544e5
Sourav Kumar, 2022-06-22 12:39:59 +05:30, committed by GitHub
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
52 changed files with 701 additions and 740 deletions

Days/Linux/create-user.sh (32 changes, Normal file → Executable file)

@@ -1,15 +1,27 @@
#! /usr/bin/bash

if [ -z "${1}" ]
then
        echo "What is your intended username?"
        read username
        echo "What is your password"
        read password

        #A user can be passed in as a command line argument
        echo "$username user account being created."

        #A user is created with the name of command line argument
        sudo useradd -m $username

        #A password can be parsed in as a command line argument.
        sudo chpasswd <<< $username:$password
        sleep 2
        echo "If you want to delete the user then pass 'del' and username in command line argument. e.g: ./create-user.sh del username"
else
        sudo userdel -rf "${2}"
        sleep 2
        echo "${2} user account successfully deleted."
        exit 0
fi
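
A quick usage sketch based on the two branches above (assuming the script has been made executable, which the Normal file → Executable file change in this commit suggests):

```bash
# No arguments: you are prompted for a username and password,
# and the account is created with a home directory.
./create-user.sh

# Pass 'del' plus the username to remove that account and its home directory.
./create-user.sh del username
```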


@@ -12,7 +12,7 @@ date: '2022-04-17T10:12:40Z'
Day 1 of our 90 days and an adventure to learn a good foundational understanding of DevOps and the tools that help with a DevOps mindset.
This learning journey started for me a few years back, but my focus then was around virtualisation platforms and cloud-based technologies; I was looking mostly into Infrastructure as Code and Application configuration management with Terraform and Chef.
Fast forward to March 2021, I was given an amazing opportunity to concentrate my efforts around the Cloud Native strategy at Kasten by Veeam, which was going to be a massive focus on Kubernetes and DevOps and the community surrounding these technologies. I started my learning journey and quickly realised there was a very wide world aside from just learning the fundamentals of Kubernetes and Containerisation, and it was then that I started speaking to the community and learning more and more about the DevOps culture, tooling and processes, so I started documenting some of the areas I wanted to learn in public.
@@ -20,31 +20,31 @@ Fast forward to March 2021, I was given an amazing opportunity to concentrate my
## Let the journey begin
If you read the above blog, you will see this is a high-level contents list for my learning journey, and I will say at this point I am nowhere near an expert in any of these sections, but what I wanted to do was share some resources, both FREE and paid for, so there is an option for everyone as we all have different circumstances.
Over the next 90 days, I want to document these resources and cover those foundational areas. I would love for the community to also get involved. Share your journey and resources so we can learn in public and help each other.
You will see from the opening readme in the project repository that I have split things into sections, and it is 12 weeks plus 6 days. For the first 6 days, we will explore the fundamentals of DevOps in general before diving into some of the specific areas. By no means is this list exhaustive and, again, I would love for the community to assist in making this a useful resource.
Another resource I will share at this point, one that I think everyone should have a good look at, maybe to create your own mind map for your interests and position, is the following:
[DevOps Roadmap](https://roadmap.sh/devops)
I found this a great resource when I was creating my initial list and blog post on this topic. You can also see other areas that go into a lot more detail outside of the 12 topics I have listed here in this repository.
## First Steps - What is DevOps?
There are so many blog articles and YouTube videos to list here, but as we start the 90-day challenge and focus on spending around an hour a day learning something new or about DevOps, I thought it was good to get some of the high level of "what DevOps is" down to begin.
Firstly, DevOps is not a tool. You cannot buy it, it is not a software SKU or an open source GitHub repository you can download. It is also not a programming language, and it is not some dark art magic either.
DevOps is a way to do smarter things in Software Development. - Hold up... But if you are not a software developer should you turn away right now and not dive into this project??? No. Not at all. Stay... Because DevOps brings together a combination of software development and operations. I mentioned earlier that I was more on the VM side, which would generally fall under the Operations side of the house, but within the community there are people with all different backgrounds for whom DevOps is 100% going to be a benefit; Developers, Operations and QA Engineers can all equally learn these best practices by having a better understanding of DevOps.
DevOps is a set of practices that help to reach the goal of this movement: reducing the time between the ideation phase of a product and its release in production to the end user, whether that is an internal team or a customer.
Another area we will dive into in this first week is **The Agile Methodology**. DevOps and Agile are widely adopted together to achieve continuous delivery of your **Application**.
The high-level takeaway is that a DevOps mindset or culture is about shrinking the long, drawn-out software release process from potentially years to being able to drop smaller releases more frequently. The other key fundamental point to understand here is the responsibility of a DevOps engineer to break down silos between the teams I previously mentioned: Developers, Operations and QA.
From a DevOps perspective, **Development, Testing and Deployment** all land with the DevOps team.


@@ -10,49 +10,49 @@ date: '2022-04-17T21:15:34Z'
---
## Responsibilities of a DevOps Engineer
Hopefully, you are coming into this off the back of going through the resources and post on [Day1 of #90DaysOfDevOps](day01.md).
It was briefly touched on in the first post, but now we must get deeper into this concept and understand that there are two main parts to creating an application. We have the **Development** part, where software developers program the application and test it. Then we have the **Operations** part, where the application is deployed and maintained on a server.
## DevOps is the link between the two
To get to grips with DevOps, or the tasks which a DevOps engineer would be carrying out, we need to understand the tools and the processes, an overview of them and how they come together.
Everything starts with the application! You will see throughout that it is all about the application when it comes to DevOps.
Developers will create an application; this can be done with many different technology stacks, and let's leave that to the imagination for now as we get into this later. This can also involve many different programming languages, build tools, code repositories etc.
As a DevOps engineer, you won't be programming the application, but having a good understanding of the concepts of how a developer works and the systems, tools and processes they are using is key to success.
At a very high level, you are going to need to know how the application is configured to talk to all of its required services or data services, and then also sprinkle in a requirement for how this can or should be tested.
The application will need to be deployed somewhere; let's keep it generally simple here and make this a server, it doesn't matter where, but a server. This is then expected to be accessed by the customer or end user, depending on the application that has been created.
This server needs to run somewhere: on-premises, in a public cloud, or serverless (Ok I have gone too far, we won't be covering serverless but it's an option and more and more enterprises are heading this way). Someone needs to create and configure these servers and get them ready for the application to run. Now, this element might land with you as a DevOps engineer to deploy and configure these servers.
These servers run an operating system, and generally speaking this is going to be Linux, but we have a whole section or week where we cover some of the foundational knowledge you should gain here.
It is also likely that we need to communicate with other services in our network or environment, so we also need to have that level of knowledge around networking and configuring it; this might to some degree also land at the feet of the DevOps engineer. Again, we will cover this in more detail in a dedicated section talking about all things DNS, DHCP, Load Balancing etc.
## Jack of all trades, Master of none
I will say at this point though, you don't need to be a Network or Infrastructure specialist; you need a foundational knowledge of how to get things up and running and talking to each other, much the same as maybe having a foundational knowledge of a programming language without needing to be a developer. However, you might be coming into this as a specialist in an area, and that is a great footing to adapt to other areas.
You will also most likely not take over the management of these servers or the application daily.
We have been talking about servers, but the likelihood is that your application will be developed to run as containers, which still run on a server for the most part, but you will also need an understanding of not only virtualisation and Cloud Infrastructure as a Service (IaaS) but also containerisation. The focus in these 90 days will be more catered towards containers.
## High-Level Overview
On one side we have our developers creating new features and functionality (as well as bug fixes) for the application.
On the other side, we have some sort of environment, infrastructure or servers which are configured and managed to run this application and communicate with all its required services.
The big question is how do we get those features and bug fixes into production and make them available to those end users?
How do we release the new application version? This is one of the main tasks for a DevOps engineer, and the important thing here is not to just figure out how to do this once; we need to do this continuously and in an automated, efficient way, which also needs to include testing!
This is where we are going to end this day of learning; hopefully this was useful. Over the next few days, we are going to dive a little deeper into some more areas of DevOps, and then we will get into the sections that dive deeper into the tooling and processes and the benefits of these.
## Resources


@@ -11,24 +11,23 @@ id: 1048825
As we continue through these next few weeks, we are 100% going to come across these titles (Continuous Development, Testing, Deployment, Monitoring) over and over again. If you are heading towards the DevOps Engineer role, then repeatability will be something you get used to, but constantly enhancing each time is another thing that keeps it interesting.
In this hour, we are going to take a look at the high-level view of the application from start to finish and then back around again like a constant loop.
### Development
Let's take a brand new example of an Application. To start with we have nothing created; maybe as a developer you have to discuss with your client or end user the requirements and come up with some sort of plan for your Application. We then need to create our brand new application from those requirements.
In regards to tooling at this stage, there is no real requirement here other than choosing your IDE and the programming language you wish to use to write your application.
As a DevOps engineer, remember you are probably not the one creating this plan or coding the application for the end user; this will be a skilled developer.
But it also would not hurt for you to be able to read some of the code so that you can make the best infrastructure decisions moving forward for your application.
We previously mentioned that this application can be written in any language. Importantly, it should be maintained using a version control system; this is something we will also cover in detail later on, and in particular we will dive into **Git**.
It is also likely that it will not be one developer working on this project, although this could be the case; even so, best practices would require a code repository to store and collaborate on the code. This could be private or public, and could be hosted or privately deployed. Generally speaking, you would hear the likes of **GitHub or GitLab** being used as a code repository. Again, we will cover these as part of our section on **Git** later on.
### Testing
At this stage, we have our requirements and we have our application being developed. But we need to make sure we are testing our code in all the different environments that we have available to us, or specifically maybe to the programming language chosen.
This phase enables QA to test for bugs; more frequently we see containers being used to simulate the test environment, which overall can improve on the cost overheads of physical or cloud infrastructure.
@@ -38,9 +37,9 @@ The ability to automate this testing vs 10s,100s or even 1000s of QA engineers h
### Integration
Quite importantly, Integration is at the middle of the DevOps lifecycle. It is the practice in which developers are required to commit changes to the source code more frequently. This could be on a daily or weekly basis.
With every commit, your application can go through the automated testing phases, and this allows for early detection of issues or bugs before the next phase.
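
As a purely illustrative sketch (none of these commands or messages come from the original text), the developer side of that loop is just small, frequent commits; a CI system then reacts to every push by running the automated build and test stages:

```bash
# Developer side: commit small changes often
git add .
git commit -m "Add input validation to the signup form"
git push origin main   # this push is the event a CI pipeline reacts to,
                       # automatically kicking off the build and test stages
```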
Now you might at this stage be saying "but we don't create applications, we buy them off the shelf from a software vendor". Don't worry, many companies do this and will continue to do this, and it will be the software vendor concentrating on the above 3 phases, but you might still want to adopt the final phase, as this will enable faster and more efficient deployments of your off-the-shelf software.
@@ -49,9 +48,9 @@ I would also suggest just having this above knowledge is very important as you m
### Deployment
Ok so we have our application built and tested against the requirements of our end user, and we now need to go ahead and deploy this application into production for our end users to consume.
This is the stage where the code is deployed to the production servers; now this is where things get extremely interesting, and it is where the rest of our 86 days dives deeper into these areas, because different applications possibly require different hardware or configurations. This is where **Application Configuration Management** and **Infrastructure as Code** could play a key part in your DevOps lifecycle. It might be that your application is **Containerised** but also available to run on a virtual machine. This then also leads us on to platforms like **Kubernetes**, which would be orchestrating those containers and making sure you have the desired state available to your end users.
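
As a hedged illustration of that desired-state idea (the file and deployment names below are invented for the example), a containerised application on Kubernetes is typically released by declaring what should be running and letting the platform converge on it:

```bash
kubectl apply -f deployment.yaml          # declare the desired state for the application
kubectl rollout status deployment/myapp   # wait until the new version is fully rolled out
kubectl get pods                          # confirm the running containers match that state
```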
All of these bold topics we will go into in more detail over the next few weeks to get a better foundational knowledge of what they are and when to use them.
### Monitoring
@@ -61,11 +60,11 @@ But now we need to be sure that our end users are getting the experience they re
This section is also where we are going to capture that feedback wheel about the features that have been implemented and how the end users would like to make these better for them.
Reliability is a key factor here as well; at the end of the day, we want our Application to be available all the time it is required. This then leads to other **observability, security and data management** areas that should be continuously monitored, and feedback can always be used to better enhance, update and release the application continuously.
Some input from the community here: specifically, [@_ediri](https://twitter.com/_ediri) mentioned that as part of this continuous process we should also have the FinOps teams involved. Apps & Data are running and stored somewhere, so you should be monitoring this continuously to make sure that if things change from a resources point of view, your costs are not causing some major financial pain on your Cloud Bills.
I think it is also a good time to bring up the "DevOps Engineer" title mentioned above. Albeit there are many DevOps Engineer positions in the wild that people hold, this is not really the ideal way of positioning the process of DevOps. What I mean is, from speaking to others in the community, the title of DevOps Engineer should not be the goal for anyone, because really any position should be adopting the DevOps processes and culture explained here. DevOps should be used in many different positions, such as Cloud-Native engineer/architect, virtualisation admin, cloud architect/engineer, and infrastructure admin. This is to name a few, but the reason for using DevOps Engineer above was really to highlight the scope of the process used by any of the above positions and more.
## Resources


@@ -29,11 +29,11 @@ and delivery practices based on cooperation between software developers and oper
## What is the difference between Agile and DevOps
The difference is mainly the preoccupations. Agile and DevOps have different preoccupations, but they help each other. Agile wants short iterations, which is only possible with the automation that DevOps brings. Agile wants the customer to try a specific version and quickly give feedback, which is only possible if DevOps makes the creation of new environments easy.
### Different participants
Agile focuses on optimising communication between end-users and developers, while DevOps targets developers and operations team members. We could say that Agile is outward-oriented towards customers, whereas DevOps is a set of internal practices.
### Team
@@ -41,7 +41,7 @@ Agile usually applies to software developers and project managers. The competenc
### Applied Frameworks
Agile has a lot of management frameworks to achieve flexibility and transparency: Scrum > Kanban > Lean > Extreme > Crystal > Dynamic > Feature-Driven. DevOps focuses on the development approach in collaboration but doesn't offer specific methodologies. However, DevOps promotes practices like Infrastructure as Code, Architecture as Code, Monitoring, Self Healing, end-to-end test automation... But per se these are not a framework, rather practices.
### Feedback
@@ -49,7 +49,7 @@ In Agile the main source of feedback is the end-user while in DevOps the feedbac
### Target areas
Agile focuses more on software development than on deployment and maintenance. DevOps focuses on software development as well, but its values and tools also cover deployment and post-release stages like monitoring, high availability, security and data protection.
### Documentation
@@ -82,7 +82,7 @@ simultaneously just follow 7 steps:
1. Unite the development and operation teams.
2. Create build and run teams; all development and operational concerns are discussed by the entire DevOps team.
3. Change your approach to sprints, and assign priority ratings so that DevOps tasks are given the same value as development tasks. Encourage development and operations teams to exchange their opinions on other teams' workflows and possible issues.
4. Include QA in all development stages.
5. Choose the right tools.
6. Automate everything you can.


@@ -2,20 +2,20 @@
title: '#90DaysOfDevOps - Plan > Code > Build > Testing > Release > Deploy > Operate > Monitor > - Day 5'
published: false
description: 90DaysOfDevOps - Plan > Code > Build > Testing > Release > Deploy > Operate > Monitor >
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048830
---
## Plan > Code > Build > Testing > Release > Deploy > Operate > Monitor >
Today we are going to focus on the individual steps from start to finish and the continuous cycle of an Application in a DevOps world.
![DevOps](Images/Day5_DevOps8.png)
### Plan:
It all starts with the planning process; this is where the development team gets together and figures out what types of features and bug fixes they're going to roll out in their next sprint. This is an opportunity for you as a DevOps Engineer to get involved, to learn what kinds of things are going to be coming your way that you need to be involved with, and also to influence their decisions or their path, helping them work with the infrastructure that you've built or steering them towards something that's going to work better for them in case they're not on that path. One key thing to point out here is that the developers or software engineering team are your customer as a DevOps engineer, so this is your opportunity to work with your customer before they go down a bad path.
### Code:
@@ -23,7 +23,7 @@ Now once that planning session's done they're going to go start writing the code
### Build:
This is where we'll kick off the first of our automation processes, because we're going to take their code and build it. Depending on what language they're using, that may mean transpiling or compiling it, or it might mean creating a Docker image from that code. Either way, we're going to go through that process using our CI/CD pipeline.
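
For example (the image and registry names below are made up for illustration, not taken from the original), the build stage for a containerised application often boils down to something like this:

```bash
# Build an image from the committed code (expects a Dockerfile in the repository root)
docker build -t registry.example.com/myapp:1.0.0 .

# Push it somewhere the later release and deploy stages can pull it from
docker push registry.example.com/myapp:1.0.0
```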
## Testing:
@@ -35,7 +35,7 @@ Once those tests pass we're going to do the release process and depending again
## Deploy:
Which is the thing that we do next, because deployment is like the end game of this whole thing; deployment is when we put the code into production, and it's not until we do that that our business realises the value from all the time, effort and hard work that you and the software engineering team have put into this product up to this point.
## Operate:
@@ -44,7 +44,7 @@ Once it's deployed we are going to operate it and operate it may involve somethi
## Monitor:
All of the above parts lead to the final step, because you need to have monitoring, especially around operational issues, auto-scaling and troubleshooting; you don't know there's a problem if you don't have monitoring in place to tell you that there's a problem. Some of the things you might build monitoring for are memory utilisation, CPU utilisation, disk space, and API endpoint response time (how quickly that endpoint is responding), and a big part of that as well is logs. Logs give developers the ability to see what is happening without having to access production systems.
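
As a rough sketch of those signals (these are common Linux and curl commands used for illustration; the URL and log path are placeholders, not from the original):

```bash
free -m                  # memory utilisation
top -b -n 1 | head -20   # a quick CPU utilisation snapshot
df -h                    # disk space
# API endpoint response time (placeholder URL)
curl -o /dev/null -s -w "HTTP %{http_code} in %{time_total}s\n" https://example.com/healthz
# Logs: see what the application is doing without logging on to production
tail -f /var/log/myapp/app.log
```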
## Rinse & Repeat:
@@ -72,7 +72,7 @@ CI Release is Success = Continuous Deployment = Deploy > Operate > Monitor
You can see these three Continuous notions above as the simple collection of phases of the DevOps Lifecycle.
This last bit was a bit of a recap for me on Day 3, but I think this makes things clearer for me.
### Resources:


@@ -13,9 +13,9 @@ DevOps to begin with was seen to be out of reach for a lot of us as we didn't ha
You will see from the second link below in the references that there are a lot of different industries and verticals using DevOps and having a hugely positive effect on their business objectives.
The overarching benefit here is that DevOps, if done correctly, should help your business improve the speed and quality of software development.
I wanted to take this Day to look at successful companies that have adopted a DevOps practice and share some resources around this. This will be another great one for the community to also dive in and help with. Have you adopted a DevOps culture in your business? Has it been successful?
I mentioned Netflix above and will touch on them again, as it is a very good model, advanced compared to what we generally see today even still, but I will also mention some other big-name brands that seem to be succeeding.
@@ -25,30 +25,28 @@ In 2010 Amazon moved their physical server footprint to Amazon Web Services (AWS
In 2011 (according to the resource below), Amazon adopted a continuous deployment process where their developers could deploy code whenever they wanted and to whatever servers they needed. This enabled Amazon to achieve deploying new software to production servers on average every 11.6 seconds!
## Netflix
Who doesn't use Netflix? A huge, quality streaming service with, by all accounts and at least personally, a great user experience.
Why is that user experience so great? Well, the ability to deliver a service with no recollected memory (for me at least) of glitches requires speed, flexibility, and attention to quality.
Netflix developers can automatically build pieces of code into deployable web images without relying on IT operations. As the images are updated, they are integrated into Netflix's infrastructure using a custom-built, web-based platform.
Continuous Monitoring is in place so that if the deployment of the images fails, the new images are rolled back and traffic is rerouted to the previous version.
There is a great talk listed below that goes into more detail about the DOs and DONTs that Netflix lives and dies by within their teams.
## Etsy
As with many of us and many companies, there was a real struggle around slow and painful deployments. In the same vein, we might have also experienced working in companies that have lots of silos and teams that are not working well together.
From what I can make out, at least from reading about Amazon and Netflix, Etsy might have adopted letting developers deploy their own code around the end of 2009, which might have been before the other two mentioned. (Interesting!)
An interesting takeaway I read here was that they realised that when developers feel responsible for deployment, they also take responsibility for application performance, uptime and other goals.
A learning culture is a key part of DevOps; even failure can be a success if lessons are learned. (Not sure where this quote came from, but it kind of makes sense!)
I have added some other stories where DevOps has changed the game within some of these massively successful companies.
## Resources
- [How Netflix Thinks of DevOps](https://www.youtube.com/watch?v=UTKIT6STSVM)
@@ -63,7 +61,7 @@ I have added some other stories where DevOps has changed the game within some of
- DevOps is a combo of Development and Operations that allows a single team to manage the whole application development lifecycle, consisting of **Development**, **Testing**, **Deployment** and **Operations**.
- The main focus and aim of DevOps is to shorten the development lifecycle while delivering features, fixes and functionality frequently, in close alignment with business objectives.
- DevOps is a software development approach through which software can be delivered and developed reliably and quickly. You may also see this referenced as **Continuous Development, Testing, Deployment, Monitoring**.


@@ -11,7 +11,7 @@ id: 1048856
I think it is fair to say that to be successful in the long term as a DevOps engineer you've got to know at least one programming language at a foundational level. I want to take this first session of this section to explore why this is such a critical skill to have, and hopefully, by the end of this week or section, you are going to have a better understanding of the why, how and what to do to progress with your learning journey.
I think if I was to ask on social media whether you need programming skills for DevOps-related roles, the answer would most likely be a hard yes. Let me know if you think otherwise. Ok, but then a bigger question, and this is where you won't get such a clear answer, is which programming language? The most common answer I have seen here has been Python, or, increasingly often, we're seeing that Golang or Go should be the language that you learn.
To be successful in DevOps you have to have a good foundation of programming skills, is my takeaway from that at least. But we have to understand why we need it to choose the right path.
@@ -42,7 +42,7 @@ An advantage of using a language like Python that is interpreted in a DevOps rol
## Go vs Python for DevOps
Go programs are statically linked; this means that when you compile a Go program everything is included in a single binary executable, and no external dependencies need to be installed on the remote machine. This makes the deployment of Go programs easy, compared to a Python program that uses external libraries, where you have to make sure that all those libraries are installed on the remote machine you wish to run it on.
Go is a platform-independent language, which means you can produce binary executables for all the operating systems, Linux, Windows, macOS etc, and it is very easy to do so. With Python, it is not as easy to create these binary executables for particular operating systems.
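
A small sketch of what that looks like in practice (the file and output names here are illustrative): `go build` produces a single self-contained binary, and the standard `GOOS`/`GOARCH` environment variables let you cross-compile for other platforms from the same machine:

```bash
# Build a single binary for the current platform
go build -o myapp main.go

# Cross-compile the same source for other operating systems/architectures
GOOS=linux GOARCH=amd64 go build -o myapp-linux-amd64 main.go
GOOS=windows GOARCH=amd64 go build -o myapp.exe main.go
GOOS=darwin GOARCH=arm64 go build -o myapp-darwin-arm64 main.go
```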
@@ -52,7 +52,7 @@ Unlike Python which often requires the use of third party libraries to implement
This is in no way throwing Python under the bus; I am just giving my reasons for choosing Go, and they are not really the above Go vs Python points. It's generally because it makes sense, as the company I work for develops software in Go, so that is why.
I will say, or at least I am told, as I am not many pages into this chapter right now, that once you learn your first programming language it becomes easier to take on other languages. You're probably never going to have a single job in any company anywhere where you don't have to deal with managing, architecting, orchestrating or debugging JavaScript and NodeJS applications.
## Resources
@@ -64,6 +64,6 @@ I will say that once you have or at least I am told as I am not many pages into
- [FreeCodeCamp - Learn Go Programming - Golang Tutorial for Beginners](https://www.youtube.com/watch?v=YS4e4q9oBaU&t=1025s)
- [Hitesh Choudhary - Complete playlist](https://www.youtube.com/playlist?list=PLRAV69dS1uWSR89FRQGZ6q9BR2b44Tr9N)
Now for the next 6 days of this topic, I intend to work through some of the resources listed above and document my notes for each day. You will notice that they are generally around 3 hours as a full course; I wanted to share my complete list so that if you have time you can move ahead and work through each one if time permits. I will be sticking to my learning hour each day.
See you on [Day 8](day08.md).
@ -21,7 +21,7 @@ If we made it this far you probably know which workstation operating system you
Also note that if you have an older version of Go installed you will have to remove it before installing; Windows has this built into the installer and will remove and install as one.
Once finished you should now open a command prompt/terminal and we want to check that we have Go installed. If you do not get the output that we see below then Go is not installed and you will need to retrace your steps.
`go version`
@ -35,17 +35,17 @@ Did you check? Are you following along? You will probably get something like the
![](Images/Day8_Go5.png)
Ok, let's create that directory; for ease I am going to use the mkdir command in my PowerShell terminal. We also need to create 3 folders within the Go folder as you will see below.
![](Images/Day8_Go6.png)
Now we have Go installed and we have our Go working directory ready for action. We now need an integrated development environment (IDE). There are many available that you can use, but the most common and the one I use is Visual Studio Code or Code. You can learn more about IDEs [here](https://www.youtube.com/watch?v=vUn5akOlFXQ).
If you have not downloaded and installed VSCode already on your workstation then you can do so by heading [here](https://code.visualstudio.com/download). As you can see below you have your different OS options.
![](Images/Day8_Go7.png)
Much the same as with the Go installation, we are going to download and install and keep the defaults. Once complete you can open VSCode, select Open File and navigate to our Go directory that we created above.
![](Images/Day8_Go8.png)
@ -55,13 +55,13 @@ Now you should see the three folders we also created earlier as well and what we
![](Images/Day8_Go9.png)
Pretty easy stuff up to this point, I would say. Now we are going to create our first Go program without yet understanding anything we put in this next phase.
Next, create a file called `main.go` in your `Hello` folder. As soon as you hit enter on main.go you will be asked if you want to install the Go extension and packages; you can also check that empty pkg folder that we made a few steps back and notice that we should have some new packages in there now.
![](Images/Day8_Go10.png)
Now let's get this Hello World app going, copy the following code into your new main.go file and save that.
```
package main
@ -72,7 +72,7 @@ func main() {
fmt.Println("Hello #90DaysOfDevOps")
}
```
Now I appreciate that the above might make no sense at all, but we will cover more about functions, packages and more in later days. For now, let's run our app. Back in the terminal and in our Hello folder we can now check that all is working. Using the command below we can check to see if our generic learning program is working.
```
go run main.go
@ -103,7 +103,6 @@ Hello #90DaysOfDevOps
- [FreeCodeCamp - Learn Go Programming - Golang Tutorial for Beginners](https://www.youtube.com/watch?v=YS4e4q9oBaU&t=1025s)
- [Hitesh Choudhary - Complete playlist](https://www.youtube.com/playlist?list=PLRAV69dS1uWSR89FRQGZ6q9BR2b44Tr9N)
See you on [Day 9](day09.md).
![](Images/Day8_Go13.png)
@ -2,10 +2,10 @@
title: '#90DaysOfDevOps - Let''s explain the Hello World code - Day 9'
published: false
description: 90DaysOfDevOps - Let's explain the Hello World code
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1099682
---
## Let's explain the Hello World code
@ -16,7 +16,7 @@ On [Day 8](day08.md) we walked through getting Go installed on your workstation
In this section, we are going to take a deeper look into the code and understand a few more things about the Go language.
### What is Compiling?
Before we get into the [6 lines of the Hello World code](Go/hello.go) we need to have a bit of an understanding of compiling.
Programming languages that we commonly use such as Python, Java, Go and C++ are high-level languages. Meaning they are human-readable, but when a machine is trying to execute a program it needs to be in a form that a machine can understand. We have to translate our human-readable code to machine code, which is called compiling.
@ -25,9 +25,9 @@ Programming languages that we commonly use such as Python, Java, Go and C++ are
From the above you can see what we did on [Day 8](day08.md) here, we created a simple Hello World main.go and we then used the command `go build main.go` to compile our executable.
### What are packages?
A package is a collection of source files in the same directory that are compiled together. We can simplify this further: a package is a bunch of .go files in the same directory. Remember our Hello folder from Day 8? If and when you get into more complex Go programs you might find that you have folder1, folder2 and folder3 containing different .go files that make up your program with multiple packages.
We use packages so we can reuse other people's code; we don't have to write everything from scratch. Maybe we want a calculator as part of our program, and you could probably find an existing Go package that contains the mathematical functions that you could import into your code, saving you a lot of time and effort in the long run.
Go encourages you to organise your code in packages so that it is easy to reuse and maintain source code.
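To make that reuse concrete, here is a minimal sketch (not from the challenge app itself) that leans on the standard library `math` package rather than writing our own square root function:
```
package main

import (
    "fmt"
    "math"
)

func main() {
    // Reuse maths that someone else has already written and tested
    fmt.Println(math.Sqrt(16)) // prints 4
}
```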
@ -44,17 +44,17 @@ A package can be named whatever you wish. We have to call this `main` as this is
Whenever we want to compile and execute our code we have to tell the machine where the execution needs to start. We do this by writing a function called main. The machine will look for a function called main to find the entry point of the program.
A function is a block of code that can do some specific task and can be used across the program.
You can declare a function with any name using `func` but in this case, we need to name it `main` as this is where the code starts.
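As a rough sketch of that idea (the `greet` function and its message are illustrative additions, not part of the Hello World program), a second function can sit alongside `main` and be called from it:
```
package main

import "fmt"

// greet is an ordinary function we can call from anywhere in this package.
func greet(name string) string {
    return "Hello " + name
}

// main is the entry point the machine looks for when the program starts.
func main() {
    fmt.Println(greet("#90DaysOfDevOps"))
}
```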
![](Images/Day9_Go4.png)
Next, we are going to look at line 3 of our code, the import; this means you want to bring another package into your main program. fmt is a standard package provided by Go, and this package contains the `Println()` function; because we have imported it we can use it on line 6. There are several standard packages you can include in your program and leverage or reuse in your code, saving you the hassle of having to write things from scratch. [Go Standard Library](https://pkg.go.dev/std)
![](Images/Day9_Go5.png)
The `Println()` that we have here is a way to write to standard output in the terminal wherever the executable has been run successfully. Feel free to change the message in between the ().
![](Images/Day9_Go6.png)
@ -2,7 +2,7 @@
title: '#90DaysOfDevOps - The Go Workspace - Day 10'
published: false
description: 90DaysOfDevOps - The Go Workspace
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048701
@ -22,7 +22,7 @@ The output should be similar to mine (with a different username may be) which is
```
/home/michael/projects/go
```
Then within here, we created 3 directories: **src**, **pkg** and **bin**
![](Images/Day10_Go1.png)
@ -55,7 +55,7 @@ To run our code we first must **compile** it. There are three ways to do this wi
Before we get to the above compile stage we need to take a look at what we get with the Go installation.
When we installed Go on Day 8 we installed something known as the Go tools, which consist of several programs that let us build and process our Go source files. One of the tools is `go`.
It is worth noting that you can install additional tools that are not in the standard Go installation.
@ -67,7 +67,7 @@ You might also remember that we have already used at least two of these tools so
![](Images/Day10_Go7.png)
The ones we want to learn more about are build, install and run.
![](Images/Day10_Go8.png)
@ -21,7 +21,7 @@ The first thing to consider here is that as we are building our app and we are w
- Variables are used to store values.
- Like a little box with our saved information or values.
- We can then use this variable across the program, which also means that if this challenge or variable changes we only have to change it in one place. This means we could translate this to other challenges we have in the community by just changing that one variable value.
To declare this in our Go program we define a value by using a **keyword** for variables. This will live within our `func main` block of code that you will see later. You can find more about [Keywords](https://go.dev/ref/spec#Keywords) here.
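As a minimal sketch of what that declaration looks like (the variable name and message mirror the challenge, but the exact code shown further down may differ slightly):
```
package main

import "fmt"

func main() {
    // var declares a variable; Go infers the string type from the value
    var challenge = "#90DaysOfDevOps"

    fmt.Println("Welcome to", challenge)
}
```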
@ -73,7 +73,7 @@ If we then go through that `go build` process again and run you will see below t
Finally, and this won't be the end of our program, we will come back to this in [Day12](day12.md) to add more functionality. We now want to add another variable for the number of days we have completed the challenge.
Below I added the `dayscomplete` variable with the number of days completed.
```
package main
@ -134,7 +134,7 @@ Go has three basic data types:
I found this resource super detailed on data types: [Golang by example](https://golangbyexample.com/all-data-types-in-golang-with-examples/)
I would also suggest [Techworld with Nana](https://www.youtube.com/watch?v=yyUHQIec83I&t=2023s) at this point, which covers a lot about the data types in Go in some detail.
If we need to define a type in our variable we can do this like so:
@ -148,7 +148,7 @@ Because Go implies variables where a value is given we can print out those value
```
fmt.Printf("challenge is %T, daystotal is %T, dayscomplete is %T\n", challenge, daystotal, dayscomplete)
```
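For completeness, here is a short sketch of declaring variables with explicit types; the values are made up for illustration and are not from the challenge app:
```
package main

import "fmt"

func main() {
    // Explicit types instead of letting Go infer them
    var challenge string = "#90DaysOfDevOps"
    var daystotal int = 90
    var dayscomplete uint = 11
    var progress float64 = 12.2

    fmt.Printf("%T %T %T %T\n", challenge, daystotal, dayscomplete, progress)
}
```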
There are many different types of integer and float types; the links above will cover these in detail.
- **int** = whole numbers
- **uint** = positive whole numbers
@ -164,6 +164,6 @@ There are many different types of integer and float types the links above will c
- [FreeCodeCamp - Learn Go Programming - Golang Tutorial for Beginners](https://www.youtube.com/watch?v=YS4e4q9oBaU&t=1025s)
- [Hitesh Choudhary - Complete playlist](https://www.youtube.com/playlist?list=PLRAV69dS1uWSR89FRQGZ6q9BR2b44Tr9N)
Next up we are going to start adding some user input functionality to our program so that it asks how many days have been completed.
See you on [Day 12](day12.md).
@ -25,7 +25,7 @@ We are on day 12 and we would need to change that `dayscomplete` every day and c
Getting user input, we want to get the value of maybe a name and the number of days completed. For us to do this we can use another function from within the `fmt` package.
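As a rough sketch of where we are heading with this (variable names are assumed for illustration, and the challenge app's actual code may differ), `fmt.Scan` can read both values from the terminal:
```
package main

import "fmt"

func main() {
    var name string
    var dayscomplete int

    fmt.Println("Enter your name:")
    fmt.Scan(&name) // Scan reads a space-separated value into the variable we point it at

    fmt.Println("How many days have you completed?")
    fmt.Scan(&dayscomplete)

    fmt.Println(name, "has completed", dayscomplete, "days of the challenge")
}
```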
Recap on the `fmt` package, which has different functions for formatted input and output (I/O):
- Print Messages
- Collect User Input
@ -80,4 +80,3 @@ Below is running this code.
- [Hitesh Choudhary - Complete playlist](https://www.youtube.com/playlist?list=PLRAV69dS1uWSR89FRQGZ6q9BR2b44Tr9N)
See you on [Day 13](day13.md).
@ -9,7 +9,7 @@ id: 1048865
---
## Tweet your progress with our new App
On the final day of looking into this programming language, we have only just touched the surface of the language, but it is that start that I think we need to get interested and excited and want to dive into it more.
Over the last few days, we have taken a small idea for an application and we have added functionality to it. In this session I want to take advantage of those packages we mentioned and create the functionality for our app to not only give you the update of your progress on screen but also send a tweet with the details of the challenge and your status.
@ -30,7 +30,7 @@ Give your application a name
![](Images/Day13_Go3.png)
You will then be given these API tokens; it is important that you save them somewhere secure. (I have since deleted this app.) We will need these later with our Go application.
![](Images/Day13_Go4.png)
@ -50,7 +50,7 @@ Remember the code we are starting within our application as well [day13_example1
We now need to think about the code to get our output or message to Twitter in the form of a tweet. We are going to be using [go-twitter](https://github.com/dghubble/go-twitter), a Go client library for the Twitter API.
To test this before putting it into our main application, I created a new directory in our `src` folder called go-twitter-bot, issued `go mod init github.com/michaelcade/go-twitter-bot` in that folder, which then created a `go.mod` file, and then we can start writing our new main.go and test this out.
We now need those keys, tokens and secrets we gathered from the Twitter developer portal. We are going to set these in our environment variables. This will depend on the OS you are running:
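However you export them for your OS, the test program itself ends up shaped roughly like the sketch below. The environment variable names are assumptions for illustration, and the calls follow the general go-twitter and oauth1 usage pattern rather than the exact code in the repository:
```
package main

import (
    "fmt"
    "log"
    "os"

    "github.com/dghubble/go-twitter/twitter"
    "github.com/dghubble/oauth1"
)

func main() {
    // Assumed environment variable names; match them to whatever you exported
    consumerKey := os.Getenv("CONSUMER_KEY")
    consumerSecret := os.Getenv("CONSUMER_SECRET")
    accessToken := os.Getenv("ACCESS_TOKEN")
    accessSecret := os.Getenv("ACCESS_TOKEN_SECRET")

    if consumerKey == "" || consumerSecret == "" || accessToken == "" || accessSecret == "" {
        log.Fatal("missing one or more Twitter credentials in the environment")
    }

    // OAuth1 signing client, then the go-twitter client on top of it
    config := oauth1.NewConfig(consumerKey, consumerSecret)
    token := oauth1.NewToken(accessToken, accessSecret)
    httpClient := config.Client(oauth1.NoContext, token)
    client := twitter.NewClient(httpClient)

    tweet, _, err := client.Statuses.Update("Hello #90DaysOfDevOps", nil)
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println("Tweeted:", tweet.Text)
}
```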
@ -14,63 +14,50 @@ A lot of technologies start on Linux, especially if they are related to software
As well, lots of open source projects, especially DevOps tools, were designed to run on Linux from the start.
From a DevOps perspective, or in fact any operations role perspective, you are mostly going to come across Linux I would say. There is a place for WinOps, but the majority of the time you are going to be administering and deploying Linux servers.
I have been using Linux daily for several years, but my go-to desktop machine has always been either macOS or Windows. However, when I moved into the Cloud Native role I am in now, I took the plunge and made my laptop fully Linux based and my daily driver. Whilst I still needed Windows for work-based applications, and a lot of my audio and video gear does not run on Linux, I was forcing myself to run a Linux desktop full time to get a better grasp of a lot of the things we are going to touch on over the next 7 days.
## Getting Started
I am not suggesting you do the same as me by any stretch, as there are easier and less destructive options, but I will say that taking that full-time step forces you to learn faster how to make things work on Linux.
For the majority of these 7 days, I am going to deploy a Virtual Machine in Virtual Box on my Windows machine. I am also going to deploy a desktop version of a Linux distribution, whereas a lot of the Linux servers you will be administering will likely be servers that come with no GUI where everything is shell-based. However, as I said at the start, a lot of the tools that we cover throughout these 90 days started on Linux, so I would also strongly encourage you to dive into running that Linux Desktop for the learning experience as well.
For the rest of this post, we are going to concentrate on getting a Ubuntu Desktop virtual machine up and running in our Virtual Box environment. Now we could just download [Virtual Box](https://www.virtualbox.org/) and grab the latest [Ubuntu ISO](https://ubuntu.com/download) from the sites linked and go ahead and build out our desktop environment, but that wouldn't be very DevOps of us, would it?
Another good reason to use most Linux distributions is that they are free and open-source. We are also choosing Ubuntu as it is probably the most widely deployed distribution, not counting mobile devices and RedHat Enterprise servers. I might be wrong there, but with CentOS and the history there I bet Ubuntu is high on the list, and it's super simple.
## Introducing HashiCorp Vagrant
Vagrant is a CLI utility that manages the lifecycle of your virtual machines. We can use vagrant to spin up and down virtual machines across many different platforms including vSphere, Hyper-V, Virtual Box and also Docker. It does have other providers, but we will stick with Virtual Box here so we are good to go.
The first thing we need to do is get Vagrant installed on our machine. When you go to the downloads page you will see all the operating systems listed for your choice. [HashiCorp Vagrant](https://www.vagrantup.com/downloads) I am using Windows so I grabbed the binary for my system and went ahead and installed it on my system.
Next up we also need to get [Virtual Box](https://www.virtualbox.org/wiki/Downloads) installed. Again, this can also be installed on many different operating systems, and a good reason to choose this and vagrant is that if you are running Windows, macOS, or Linux then we have you covered here.
Both installations are pretty straightforward and both have great communities around them, so feel free to reach out if you have issues and I can try and assist too.
## Our first VAGRANTFILE
The VAGRANTFILE describes the type of machine we want to deploy. It also defines the configuration and provisioning for this machine.
When it comes to saving these and organizing your VAGRANTFILEs, I tend to put them in their own folders in my workspace. You can see below how this looks on my system. Hopefully following this you will play around with Vagrant and see the ease of spinning up different systems; it is also great for that rabbit hole known as distro hopping for Linux Desktops.
![](Images/Day14_Linux1.png)
Let's take a look at that VAGRANTFILE and see what we are building.
```
@ -85,7 +72,7 @@ Vagrant.configure("2") do |config|
v.cpus = 4
v.customize ["modifyvm", :id, "--vram", "128"]
end
@ -93,12 +80,10 @@ end
```
This is a very simple VAGRANTFILE overall. We are saying that we want a specific "box", a box being possibly either a public image or a private build of the system you are looking for. You can find a long list of "boxes" publicly available here in the [public catalogue of Vagrant boxes](https://app.vagrantup.com/boxes/search)
On the next line we are saying that we want to use a specific provider, in this case `VirtualBox`. We also set our machine's memory to `8GB` and the number of CPUs to `4`. My experience tells me that you may also want to add the following line if you experience display issues. This will set the video memory to what you want; I would ramp this right up to `128MB`, but it depends on your system.
```
@ -110,63 +95,50 @@ v.customize ["modifyvm", :id, "--vram", ""]
I have also placed a copy of this specific vagrant file in the [Linux Folder](Linux/VAGRANTFILE)
## Provisioning our Linux Desktop
We are now ready to get our first machine up and running in our workstation's terminal. In my case I am using PowerShell on my Windows machine. Navigate to your projects folder, where you will find your VAGRANTFILE. Once there you can type the command `vagrant up` and if everything is alright you will see something like this.
![](Images/Day14_Linux2.png)
Another thing to add here is that the network will be set to `NAT` on your virtual machine. At this stage we don't need to know about NAT and I plan to have a whole session talking about it in the Networking section. Know that it is the easy button when it comes to getting a machine on your home network; it is also the default networking mode in Virtual Box. You can find out more in the [Virtual Box documentation](https://www.virtualbox.org/manual/ch06.html#network_nat)
Once `vagrant up` is complete we can now use `vagrant ssh` to jump straight into the terminal of our new VM.
![](Images/Day14_Linux3.png)
This is where we will do most of our exploring over the next few days, but I also want to dive into some customizations for your developer workstation that I have done which make your life much simpler when running this as your daily driver. And of course, are you really in DevOps unless you have a cool nonstandard terminal?
But just to confirm, in Virtual Box you should see the login prompt when you select your VM.
![](Images/Day14_Linux4.png)
Oh and if you made it this far and you have been asking "WHAT IS THE USERNAME & PASSWORD?"
- Username = vagrant
- Password = vagrant
Tomorrow we are going to get into some of the commands and what they do. The terminal is going to be the place to make everything happen.
## Resources
- [Learn the Linux Fundamentals - Part 1](https://www.youtube.com/watch?v=kPylihJRG70)
- [Linux for hackers (don't worry you don't need to be a hacker!)](https://www.youtube.com/watch?v=VbEx7B_PTOE)
There are going to be lots of resources I find as we go through, and much like the Go resources I am generally going to be keeping them to FREE content so we can all partake and learn here.
As I mentioned, next up we will take a look at the commands we might be using on a daily basis whilst in our Linux environments.
See you on [Day15](day15.md)
@ -2,18 +2,18 @@
title: '#90DaysOfDevOps - Linux Commands for DevOps (Actually everyone) - Day 15'
published: false
description: 90DaysOfDevOps - Linux Commands for DevOps (Actually everyone)
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048834
---
## Linux Commands for DevOps (Actually everyone)
I mentioned [yesterday](day14.md) that we are going to be spending a lot of time in the terminal with some commands to get stuff done.
I also mentioned that with our vagrant provisioned VM we can use `vagrant ssh` and gain access to our box. You will need to be in the same directory as we provisioned it from.
For SSH you won't need the username and password, you will only need that if you decide to log in to the Virtual Box console.
This is where we want to be as per below:
@ -21,16 +21,16 @@ This is where we want to be as per below:
## Commands
I cannot cover all the commands here; there are pages and pages of documentation that cover these. But if you are ever in your terminal and you just need to understand the options for a specific command, we have the `man` pages, short for manual. We can use these to go through each of the commands we touch on during this post to find out more options for each one. We can run `man man` which will give you the help for the manual pages. To escape the man pages you should press `q` for quit.
![](Images/Day15_Linux2.png)
![](Images/Day15_Linux3.png)
`sudo` - If you are familiar with Windows and the right-click `run as administrator`, we can think of `sudo` as very much this. When you run a command with `sudo` you will be running it as `root`, and it will prompt you for the password before running the command.
![](Images/Day15_Linux4.png)
For one-off jobs like installing applications or services you might need `sudo`, but what if you have several tasks to deal with and you want to live as `sudo` for a while? This is where you can use `sudo su`; the same as `sudo`, once entered you will be prompted for your `root` password. In a test VM like ours this is fine, but I would find it very hard for us to be rolling around as `root` for prolonged periods; bad things can happen. To get out of this elevated position you simply type in `exit`
![](Images/Day15_Linux5.png)
@ -38,11 +38,11 @@ I find myself using `clear` all the time, the `clear` command does exactly what
![](Images/Day15_Linux6.png)
Let's now look at some commands where we can actually create things within our system and then visualise them in our terminal. First of all, we have `mkdir` which will allow us to create a folder in our system. With the following command, we can create a folder in our home directory called Day15: `mkdir Day15`
![](Images/Day15_Linux7.png)
`cd` allows us to change directory, so to move into our newly created directory we can do this with `cd Day15`; tab can also be used to autocomplete the available directories. If we want to get back to where we started we can use `cd ..`
![](Images/Day15_Linux8.png)
@ -50,19 +50,19 @@ With `cd` this allows us to change directory, so for us to move into our newly c
![](Images/Day15_Linux9.png)
I am sure we have all done it where we have navigated to the depths of our file system to a directory and not known where we are. `pwd` gives us the printout of the working directory; as much as it looks like password, pwd stands for print working directory.
![](Images/Day15_Linux10.png)
We know how to create folders and directories, but how do we create files? We can create files using the `touch` command; if we were to run `touch Day15` this would create a file. Ignore `mkdir`, we are going to see this again later.
![](Images/Day15_Linux11.png)
`ls` - I can put my house on this, you will use this command so many times; it is going to list all the files and folders in the current directory. Let's see if we can see that file we just created.
![](Images/Day15_Linux12.png)
How can we find files on our Linux system? `locate` is going to allow us to search our file system. If we use `locate Day15` it will report back the location of the file. The bonus round is that if you know the file does exist but you get a blank result, then run `sudo updatedb` which will index all the files in the file system, then run your `locate` again. If you do not have `locate` available to you, you can install it using the command `sudo apt install mlocate`
![](Images/Day15_Linux13.png)
@ -82,7 +82,7 @@ We have looked at moving files around but what if I just want to copy files from
![](Images/Day15_Linux17.png)
We have created folders and files but we haven't put any contents into them. We can add contents in a few ways, but an easy way is `echo`; we can also use `echo` to print out a lot of things in our terminal, and I use echo a lot to print out system variables to check whether they are set. We can use `echo "Hello #90DaysOfDevOps" > Day15` and this will add that text to our file. We can also append to our file using `echo "Commands are fun!" >> Day15`
![](Images/Day15_Linux18.png)
@ -110,7 +110,7 @@ You can easily add to your bash_profile:
```
echo 'export HISTTIMEFORMAT="%d-%m-%Y %T "' >> ~/.bash_profile
```
It is also useful to allow the history file to grow bigger:
```
echo 'export HISTSIZE=100000' >> ~/.bash_profile
@ -119,7 +119,7 @@ echo 'export HISTFILESIZE=10000000' >> ~/.bash_profile
![](Images/Day15_Linux21.png)
Need to change your password? `passwd` is going to allow us to change our password. Note that when you add your password like this, it is hidden and will not be shown in `history`; however, if your command includes `-p PASSWORD` then it will be visible in your `history`.
![](Images/Day15_Linux22.png)
@ -131,7 +131,7 @@ Creating a group again requires `sudo` and we can use `sudo groupadd DevOps` the
![](Images/Day15_Linux24.png)
How do we add users to the `sudo` group? This would be a very rare occasion, but to do this it would be `usermod -a -G sudo NewUser`
### Permissions
@ -150,7 +150,7 @@ A full list:
You will also see `777` or `775` and these represent the same numbers as the list above, but each digit represents **User - Group - Everyone**
Let's take a look at our file with `ls -al Day15`; you can see the 3 groups mentioned above, user and group have read & write but everyone only has read.
![](Images/Day15_Linux25.png)
@ -162,11 +162,11 @@ What about changing the owner of the file? We can use `chown` for this operation
![](Images/Day15_Linux27.png)
A command that you will come across is `awk`, which comes into real use when you have an output that you only need specific data from. For example, running `who` we get lines with information, but maybe we only need the names. We can run `who | awk '{print $1}'` to get just a list of that first column.
![](Images/Day15_Linux28.png)
If you are looking to read streams of data from standard input and then generate and execute command lines, meaning taking the output of one command and passing it as an argument to another command, then `xargs` is a useful tool for this use case. If, for example, I want a list of all the Linux user accounts on the system I can run `cut -d: -f1 < /etc/passwd` and get the long list we see below.
![](Images/Day15_Linux29.png)
@ -178,31 +178,13 @@ I didn't mention the `cut` command either, this allows us to remove sections fro
![](Images/Day15_Linux31.png)
Also to note, if you type a command and you are no longer happy with it and you want to start again, just hit control + c and this will cancel that line and start you afresh.
## Resources
- [Learn the Linux Fundamentals - Part 1](https://www.youtube.com/watch?v=kPylihJRG70)
- [Linux for hackers (don't worry you don't need to be a hacker!)](https://www.youtube.com/watch?v=VbEx7B_PTOE)
See you on [Day16](day16.md)
This is a pretty heavy list already, but I can safely say that I have used all of these commands in my day to day, be it administering Linux servers or on my Linux Desktop. It is very easy when you are in Windows or macOS to navigate the UI, but on Linux servers that is not there; everything is done through the terminal.
@ -2,7 +2,7 @@
title: '#90DaysOfDevOps - Managing your Linux System, Filesystem & Storage - Day 16'
published: false
description: '90DaysOfDevOps - Managing your Linux System, Filesystem & Storage'
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048702
@ -81,7 +81,7 @@ On Windows, you have C: drive and that is what we consider the root. On Linux we
![](Images/Day16_Linux11.png)
- `/etc` - Likely the most important folder on your Linux system, this is where the majority of your configuration files are.
![](Images/Day16_Linux12.png)
@ -89,7 +89,7 @@ On Windows, you have C: drive and that is what we consider the root. On Linux we
![](Images/Day16_Linux13.png)
- `/lib` - We mentioned that `/bin` is where our binaries and executables live, and `/lib` is where you will find the shared libraries for those.
![](Images/Day16_Linux14.png)
@ -2,7 +2,7 @@
title: '#90DaysOfDevOps - Text Editors - nano vs vim - Day 17'
published: false
description: 90DaysOfDevOps - Text Editors - nano vs vim
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048703
@ -43,7 +43,7 @@ The first question might be "How do I exit vim?" that is going to be `escape` an
![](Images/Day17_Linux3.png)
You start in `normal` mode; there are other modes: `command, normal, visual, insert`. If we want to add text we need to switch from `normal` to `insert` by pressing `i`. If you have added some text and would like to save the changes, then hit escape followed by `:wq`
![](Images/Day17_Linux4.png)
@ -55,7 +55,7 @@ There is some cool fast functionality with vim that allows you to do menial task
![](Images/Day17_Linux6.png)
Now we want to replace that word with 90DaysOfDevOps, we can do this by hitting `ESC` and typing `:%s/Day/90DaysOfDevOps`
![](Images/Day17_Linux7.png)
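As a side note, the substitution above only replaces the first `Day` on each line; adding `/g` replaces every occurrence. If you ever want to run the same substitution without opening vim interactively, something like the below works (a sketch, assuming a hypothetical file called `90Days.txt`):
```
vim -c '%s/Day/90DaysOfDevOps/g' -c 'wq' 90Days.txt   # run the substitution on every occurrence, then save and quit
```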
@ -63,7 +63,7 @@ The outcome when you hit enter is that the word day is then replaced with 90Days
![](Images/Day17_Linux8.png)
Copy and Paste was a big eye-opener for me. In vim, copy is not called copy, it is yank: we can copy a line using `yy` in normal mode, `p` pastes it below the current line and `P` pastes it above.
You can also delete lines by typing the number of lines you wish to delete followed by `dd`

View File

@ -2,7 +2,7 @@
title: '#90DaysOfDevOps - SSH & Web Server - Day 18'
published: false
description: 90DaysOfDevOps - SSH & Web Server
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048733
@ -37,13 +37,13 @@ If we use our client to connect with the correct credentials or SSH key then we
### Adding a bridged network adapter to our system
For us to use this with our current VirtualBox VM, we need to add a bridged network adapter to our machine.
Power down your virtual machine, right-click on your machine within VirtualBox and select settings. In the new window then select networking.
![](Images/Day18_Linux2.png)
Now power your machine back on and you will now have an IP address on your local machine. You can confirm this with the `ip addr` command.
### Confirming SSH server is running
@ -53,13 +53,13 @@ We know SSH is already configured on our machine as we have been using it with v
![](Images/Day18_Linux3.png)
If your system does not have the SSH server then you can install it by issuing the command `sudo apt install openssh-server`
You then want to make sure that SSH is allowed if the firewall is running. We can do this with `sudo ufw allow ssh`; this is not required in our configuration as we automated it with our Vagrant provisioning.
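A minimal sketch of the checks mentioned above, assuming an Ubuntu system where the service is called `ssh`:
```
systemctl status ssh              # confirm the OpenSSH server is installed and running
sudo apt install openssh-server   # install it if it is missing
sudo ufw allow ssh                # open port 22 if the ufw firewall is enabled
```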
### Remote Access - SSH Password
Now that we have our SSH Server listening on port 22 for any incoming connection requests and we have added the bridged networking, we could use PuTTY or an SSH client on our local machine to connect to our system using SSH.
![](Images/Day18_Linux4.png)
@ -87,9 +87,9 @@ I am not going to get into what `ed25519` is and means here but you can have a s
![](Images/Day18_Linux7.png)
At this point, we have our created SSH key stored in `C:\Users\micha/.ssh/`
But to link this with our Linux VM we need to copy the key. We can do this by using `ssh-copy-id vagrant@192.168.169.135`
I used PowerShell to create my keys on my Windows client but there is no `ssh-copy-id` available there. There are ways in which you can do this on Windows and a small search online will find you an alternative, but I will just use Git Bash on my Windows machine to make the copy.
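Putting the key steps together, the flow looks roughly like this (the IP address is the one from my lab above, yours will differ):
```
ssh-keygen -t ed25519                 # generate the key pair, accepting the defaults
ssh-copy-id vagrant@192.168.169.135   # copy the public key to the Linux VM
ssh vagrant@192.168.169.135           # should now log in without asking for a password
```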
@ -111,7 +111,7 @@ there is a line in here with `PasswordAuthentication yes` this will be `#` comme
Not specifically related to what we have just done with SSH above but I wanted to include this as it is another task that you might find a little daunting but it really should not be.
We have our Linux playground VM and at this stage, we want to add an Apache webserver to our VM so that we can host a simple website from it that serves my home network. Note that this web page will not be accessible from the internet; this can be done but it will not be covered here.
You might also see this referred to as a LAMP stack.
@ -137,7 +137,7 @@ MySQL is a database in which we will be storing our data for our simple website.
### PHP
PHP is a server-side scripting language, we will use this to interact with a MySQL database. The final installation is to get PHP and its dependencies installed using `sudo apt-get install php libapache2-mod-php php-mysql`
The first configuration change we want to make: out of the box Apache uses index.html and we want it to use index.php instead.
We are going to use `sudo nano /etc/apache2/mods-enabled/dir.conf` and we are going to move index.php to the first item in the list.
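For reference, the edit is roughly as below; the exact contents of `dir.conf` may differ slightly on your system:
```
sudo nano /etc/apache2/mods-enabled/dir.conf   # move index.php to the front of the DirectoryIndex line
# the line should end up looking something like:
#   DirectoryIndex index.php index.html index.cgi index.pl index.xhtml index.htm
sudo systemctl restart apache2                 # restart Apache so the change takes effect
```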
@ -163,8 +163,7 @@ Now navigate to your Linux VM IP again with the additional 90Days.php on the end
### WordPress Installation
I then walked through this tutorial to get WordPress up on our LAMP stack; some commands are shown below in case they are not shown correctly in the walkthrough: [How to install WordPress on Ubuntu with LAMP](https://blog.ssdnodes.com/blog/how-to-install-wordpress-on-ubuntu-18-04-with-lamp-tutorial/)
`sudo mysql -u root -p`
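Inside MySQL, the database and user for WordPress are created with something like the below; the database name, username and password here are placeholders, the linked tutorial uses its own values:
```
sudo mysql -u root -p <<'SQL'
CREATE DATABASE wordpress DEFAULT CHARACTER SET utf8mb4;
CREATE USER 'wordpressuser'@'localhost' IDENTIFIED BY 'changeme';
GRANT ALL PRIVILEGES ON wordpress.* TO 'wordpressuser'@'localhost';
FLUSH PRIVILEGES;
SQL
```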
@ -190,7 +189,7 @@ I then walked through this tutorial to get WordPress up on our LAMP stack, some
`sudo rm latest.tar.gz`
At this point you are at Step 4 in the linked article; you will need to follow the steps to make sure all the correct permissions are in place for the WordPress directory.
Because this is internal only, you do not need to "generate security keys" in this step. Move to Step 5, which is changing the Apache configuration to WordPress.

View File

@ -15,7 +15,7 @@ BASH - **B**ourne **A**gain **Sh**ell
We could almost dedicate a whole section of 7 days to shell scripting, much like the programming languages; bash gives us the capability of working alongside other automation tools to get things done.
I still speak to a lot of people who have set up some complex shell scripts to make something happen and they rely on this script for some of the most important things in the business. I am not saying we need to learn shell/bash scripting for this purpose, this is not the way. But we should learn shell/bash scripting to work alongside our automation tools and for ad-hoc tasks.
An example of this that we have used in this section could be the VAGRANTFILE we used to create our VM: we could wrap this into a simple bash script that deleted and renewed it every Monday morning so that we have a fresh copy of our Linux VM every week; we could also add all the software stack that we need on said Linux machine and so on, all through this one bash script.
@ -42,7 +42,7 @@ However, you may see other paths listed in already created shell scripts which c
- `#!/bin/bash`
- `#!/usr/bin/env bash`
In the next line of our script, I like to add a comment with the purpose of the script or at least some information about me. You can do this using `#`. This allows us to add comments to particular lines in our code and provide descriptions of what the upcoming commands will be doing. I find the more notes the better for the user experience, especially if you are sharing this.
I sometimes use figlet, a program we installed earlier in the Linux section, to create some ASCII art to kick things off in our scripts.
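Putting those first few lines together, a script might start off something like this (a sketch, assuming figlet is installed as above):
```
#!/usr/bin/env bash
# 90DaysOfDevOps - a short description of what this script does and who wrote it
figlet "90DaysOfDevOps"   # print an ASCII art banner to kick things off
echo "Welcome to the script"
```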
@ -73,7 +73,7 @@ Now we can run our script again using `./90DaysOfDevOps.sh` after running the sc
Pretty basic stuff but you can start to see hopefully how this could be used to call on other tools as part of ways to make your life easier and automate things.
### Variables, Conditionals
A lot of this section is a repeat of what we covered when we were learning Golang but I think it's worth us diving in here again.
- ### Variables
@ -152,7 +152,7 @@ We might also use bash scripting to determine information about files and folder
- `-d file` True if the file is a directory
- `-e file` True if the file exists
- `-f file` True if the provided string is a file
- `-g file` True if the group id is set on a file
- `-r file` True if the file is readable
- `-s file` True if the file has a non-zero size
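A minimal sketch of how a couple of these operators are used in practice; the filename here is just a placeholder:
```
#!/usr/bin/env bash
FILE="90DaysOfDevOps.txt"   # hypothetical file to test
if [ -f "$FILE" ]; then
    echo "$FILE is a regular file"
elif [ -d "$FILE" ]; then
    echo "$FILE is a directory"
else
    echo "$FILE was not found"
fi
```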
@ -182,14 +182,14 @@ I found this amazing repository on GitHub that has what seems to be an endless a
**Requirements**:
- A user can be passed in as a command line argument.
- A user is created with the name of the command line argument.
- A password can be passed in as a command line argument.
- The password is set for the user.
- A message of successful account creation is displayed.
Let's start by creating our shell script with `touch create_user.sh`
Before we move on, let's also make this executable using `chmod +x create_user.sh`
Then we can use `nano create_user.sh` to start editing our script for the scenario we have been set.
@ -216,7 +216,7 @@ Next up we can take that second requirement "A user is created with the name of
#A user can be passed in as a command line argument
echo "$1 user account being created."
#A user is created with the name of the command line argument
sudo useradd -m "$1"
```
@ -227,7 +227,7 @@ We can then check this account has been created with the `awk -F: '{ print $1}'
![](Images/Day19_Linux11.png)
Our next requirement is "A password can be passed in as a command line argument." First of all, we are never going to do this in production; it is more for us to work through the list of requirements in the lab.
```
#! /usr/bin/bash
@ -235,10 +235,10 @@ Our next requirement is "A password can be parsed in as a command line argument.
#A user can be passed in as a command line argument
echo "$1 user account being created."
#A user is created with the name of the command line argument
sudo useradd -m "$1"
#A password can be passed in as a command line argument.
sudo chpasswd <<< "$1":"$2"
```
@ -248,9 +248,9 @@ You can see from the below image that we executed our script it created our user
![](Images/Day19_Linux12.png)
The final requirement is "A message of successful account creation is displayed." We already have this in the top line of our code, and we can see in the above screenshot that `90DaysOfDevOps user account being created` is shown. This was left from our testing with the `$1` parameter.
Now, this script can be used to quickly onboard and set up new users on our Linux systems. But maybe, instead of someone having to work through this and then hand other people their new usernames and passwords, we could add some of the user input we covered earlier on to capture our variables.
```
#! /usr/bin/bash
@ -263,10 +263,10 @@ read password
#A user can be passed in as a command line argument
echo "$username user account being created."
#A user is created with the name of the command line argument
sudo useradd -m $username
#A password can be passed in as a command line argument.
sudo chpasswd <<< $username:$password
```
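One small improvement worth knowing about: `read -s` stops the password from being echoed to the terminal while it is typed. A sketch of the same flow with that change:
```
#!/usr/bin/env bash
echo "What is your intended username?"
read username
echo "What is your password?"
read -s password            # -s keeps the typed password off the screen
sudo useradd -m "$username"
echo "$username:$password" | sudo chpasswd
echo "$username user account created."
```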

View File

@ -2,7 +2,7 @@
title: '#90DaysOfDevOps - Dev workstation setup - All the pretty things - Day 20'
published: false
description: 90DaysOfDevOps - Dev workstation setup - All the pretty things
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048734
@ -17,7 +17,7 @@ I have put together a YouTube video walking through the rest as some people migh
[![Click to access YouTube Video](Images/Day20_YouTube.png)](https://youtu.be/jeEslAtHfKc)
Out of the box, our system will look something like the below:
![](Images/Day20_Linux1.png)
@ -28,7 +28,7 @@ We can also see our default bash shell below,
A lot of this comes down to dotfiles, something we will cover in this final Linux session of the series.
### dotfiles
First up I want to dig into dotfiles. I have said on a previous day that Linux is made up of configuration files; these dotfiles are configuration files for your Linux system and applications.
I will also add that dotfiles are not just used to customise and make your desktop look pretty, there are also dotfile changes and configurations that will help you with productivity.
@ -59,7 +59,7 @@ I selected `1` to the above question and now we have some more options.
![](Images/Day20_Linux5.png)
You can see from this menu that we can make some out of the box edits to configure ZSH to our needs.
If you exit the wizard with a `0` and then use `ls -al | grep .zshrc` you should see we have a new configuration file.
@ -83,11 +83,11 @@ Let's get Oh My ZSH installed, we have a few options with `curl` `wget` or `fetc
`sh -c "$(curl -fsSL https://raw.githubusercontent.com/ohmyzsh/ohmyzsh/master/tools/install.sh)"`
When you have run the above command you should see some output like the below.
![](Images/Day20_Linux7.png)
Now we can move on to putting a theme in for our experience; there are well over 100 bundled with Oh My ZSH but my go-to for all of my applications and everything is the Dracula theme.
I also want to add that these two plugins are a must when using Oh My ZSH.
@ -129,23 +129,23 @@ A short list of the programs I install on the machine using `apt`
### Dracula theme
This is the only theme I am using at the moment. It looks clear and clean and everything looks great. [Dracula Theme](https://draculatheme.com/) It also has you covered when you have lots of other programs you use on your machine.
From the link above we can search for zsh on the site and you will find at least two options.
Follow the instructions listed to install either manually or using git. Then you will need to edit your `.zshrc` configuration file as per below.
![](Images/Day20_Linux8.png)
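For reference, the relevant part of the `.zshrc` ends up looking something like the below; the theme file name and the plugin names here are assumptions and may differ depending on what you installed and how:
```
ZSH_THEME="dracula"                                        # assumes dracula.zsh-theme sits in ~/.oh-my-zsh/themes
plugins=(git zsh-autosuggestions zsh-syntax-highlighting)  # plugin names assumed, adjust to what you installed
source $ZSH/oh-my-zsh.sh
```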
You are next going to want the [Gnome Terminal Dracula theme](https://draculatheme.com/gnome-terminal) with all instructions available here as well.
It would take a long time for me to document every step so I created a video walkthrough of the process. (**Click on the image below**)
[![](Images/Day20_YouTube.png)](https://youtu.be/jeEslAtHfKc)
If you made it this far, then we have now finished our Linux section of #90DaysOfDevOps. Once again I am open to feedback and additions to the resources here.
I also thought it was easier to show you a lot of these steps through video vs writing them down here, what do you think about this? I do have a goal to work back through these days and where possible create video walkthroughs to add in and better explain and show some of the things we have covered. What do you think?
## Resources

View File

@ -9,24 +9,24 @@ id: 1048761
---
## The Big Picture: DevOps and Networking
Welcome to Day 21! We are going to be getting into Networking over the next 7 days; Networking and DevOps are the overarching themes but we will need to get into some of the networking fundamentals as well.
Ultimately, as we have said previously, DevOps is about a culture and process change within your organisation. As we have discussed, this can be Virtual Machines, Containers, or Kubernetes, but it can also be the network. If we are using those DevOps principles for our infrastructure, that has to include the network. More to the point, from a DevOps point of view you also need to know about the network, as in the different topologies, networking tools and stacks that we have available.
I would argue that we should have our networking devices configured using infrastructure as code and have everything automated like we would our virtual machines, but to do that we have to have a good understanding of what we are automating.
### What is NetDevOps | Network DevOps?
You may also hear the terms Network DevOps or NetDevOps. Maybe you are already a Network engineer and have a great grasp of the network components within the infrastructure; you understand the elements used around networking such as DHCP, DNS, NAT etc. You will also have a good understanding of the hardware or software-defined networking options, switches, routers etc.
But if you are not a network engineer then we probably need to get foundational knowledge across the board in some of those areas so that we can understand the end goal of Network DevOps.
But in regards to those terms, we can think of NetDevOps or Network DevOps as applying the DevOps principles and practices to the network: applying version control and automation tools to network creation, testing, monitoring, and deployments.
If we think of Network DevOps as requiring automation, we mentioned before that DevOps breaks down the silos between teams. If the networking teams do not change to a similar model and process then they become the bottleneck or even the overall point of failure.
Using the automation principles around provisioning, configuration, testing, version control and deployment is a great start. Automation is overall going to enable speed of deployment, stability of the networking infrastructure and consistent improvement, as well as the process being shared across multiple environments once it has been tested. For example, a Network Policy that has been fully tested in one environment can be used quickly in another location because it lives in code rather than in a manually authored process, which it might have been before.
A really good viewpoint and outline of this thinking can be found here: [Network DevOps](https://www.thousandeyes.com/learning/techtorials/network-devops)
## Networking The Basics
@ -34,7 +34,7 @@ Let's forget the DevOps side of things to begin with here and we now need to loo
### Network Devices
A **host** is any device which sends or receives traffic.
![](Images/Day21_Networking1.png)
@ -55,13 +55,13 @@ A logical group of hosts which require similar connectivity.
![](Images/Day21_Networking4.png)
A **router** facilitates communication between networks. As we said before, a switch looks after communication within a network; a router allows us to join these networks together, or at least give them access to each other if permitted.
A router can provide a traffic control point (security, filtering, redirecting). More and more switches also provide some of these functions now.
Routers learn which networks they are attached to. These are known as routes; a routing table is all the networks a router knows about.
A router has an IP address in each network it is attached to. This IP is also going to be each host's way out of their local network, also known as a gateway.
Routers also create the hierarchy in networks I mentioned earlier.
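You can see both of those ideas on any Linux host; a quick sketch:
```
ip route show   # the "default via ..." line is our router acting as the gateway
ip addr show    # the IP addresses this host holds on the networks it is attached to
```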
@ -75,7 +75,7 @@ Routers also create the hierarchy in networks I mentioned earlier.
**Switching** is the process of moving data within networks.
- A Switch is a device whose primary purpose is switching.
This is very much a foundational overview of devices as we know there are many different Network Devices such as:
@ -90,7 +90,7 @@ This is very much a foundational overview of devices as we know there are many d
Although all of these devices are going to perform Routing and/or Switching.
Over the next few days, we are going to get to know a little more about this list.
- OSI Model
- Network Protocols

View File

@ -9,44 +9,44 @@ id: 1049037
---
## The OSI Model - The 7 Layers
The overall purpose of networking as an industry is to allow two hosts to share data. Before networking, if I wanted to get data from one host to another, I'd have to plug something into the first host, walk it over to the other host and plug it in there.
Networking allows us to automate this by letting hosts share data automatically across the wire; for these hosts to do this they must follow a set of rules.
This is no different from any language. English has a set of rules that two English speakers must follow. Spanish has its own set of rules. French has its own set of rules. Networking also has its own set of rules.
The rules for networking are divided into seven different layers and those layers are known as the OSI model.
### Introduction to the OSI Model
The OSI Model (Open Systems Interconnection Model) is a framework used to describe the functions of a networking system. The OSI model characterises computing functions into a universal set of rules and requirements to support interoperability between different products and software. In the OSI reference model, the communications between computing systems are split into seven different abstraction layers: **Physical, Data Link, Network, Transport, Session, Presentation, and Application**.
![](Images/Day22_Networking1.png)
### Physical
Layer 1 in the OSI model is known as physical: the premise of being able to get data from one host to another through a means, be it a physical cable or Wi-Fi, which we could also consider in this layer. We might also see some more legacy hardware here, such as hubs and repeaters, to transport the data from one host to another.
![](Images/Day22_Networking2.png)
### Data Link
Layer 2, the data link layer, enables node-to-node transfer where data is packaged into frames. There is also a level of error correction for errors that might have occurred at the physical layer. This is also where we first see MAC addresses.
This is where we see the first mention of switches that we covered on our first day of networking on [Day 21](day21.md)
![](Images/Day22_Networking3.png)
### Network
You have likely heard the terms layer 3 switch or layer 2 switch. In our OSI model Layer 3, the network layer, has a goal of end to end delivery; this is where we see the IP addresses also mentioned in the first-day overview.
Routers and hosts exist at layer 3; remember the router gives us the ability to route between multiple networks. Anything with an IP could be considered Layer 3.
![](Images/Day22_Networking4.png)
So why do we need addressing schemes on both Layers 2 and 3? (MAC Addresses vs IP Addresses)
If we think about getting data from one host to another, each host has an IP address but there are several switches and routers in between. Each of those devices has a layer 2 MAC address.
The layer 2 MAC address will go from host to switch/router only, it is focused on hops, whereas the layer 3 IP addresses will stay with that packet of data until it reaches its end host. (End to End)
IP Addresses - Layer 3 = End to End Delivery
@ -55,16 +55,16 @@ MAC Addresses - Layer 2 = Hop to Hop Delivery
Now there is a network protocol that we will get into, but not today, called ARP (Address Resolution Protocol) which links our Layer 3 and Layer 2 addresses.
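You can peek at that Layer 3 to Layer 2 mapping on a Linux host with the neighbour table (a quick sketch):
```
ip neigh show   # the ARP/neighbour table: which MAC address answers for each known IP
```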
### Transport
Service to Service delivery, Layer 4 is there to distinguish data streams. In the same way that Layer 3 and Layer 2 both had their addressing schemes, in Layer 4 we have ports.
![](Images/Day22_Networking5.png)
### Session, Presentation, Application
The distinction between Layers 5, 6 and 7 is, or has become, somewhat vague.
It is worth looking at the [TCP IP Model](https://www.geeksforgeeks.org/tcp-ip-model/) to get a more recent understanding.
Let's now try and explain what's happening when hosts are communicating with each other using this networking stack. This host has an application that's going to generate data that is meant to be sent to another host.
The source host is going to go through what's known as the encapsulation process. That data will first be sent to layer 4.
@ -72,8 +72,8 @@ Layer 4 is going to add a header to that data which can facilitate the goal of l
This may also be known as a segment (Data and Port)
This segment is going to be passed down the OSI stack to layer 3, the network layer, and the network layer is going to add another header to this data.
This header is going to facilitate the goal of layer 3, which is end to end delivery, meaning this header will contain a source IP address and a destination IP; the header plus data may also be referred to as a packet.
Layer 3 will then take that packet and hand it off to layer 2; layer 2 will once again add another header to that data to accomplish layer 2's goal of hop to hop delivery, meaning this header will include a source and destination MAC address.
This is known as a frame when you have the layer 2 header and data.
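If you want to see these headers being added in practice, a packet capture shows them all; a sketch, assuming tcpdump is installed and your interface is called eth0:
```
sudo tcpdump -i eth0 -e -nn -c 5 port 80   # -e prints the Layer 2 (MAC) header, -nn shows raw IPs and ports
```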
@ -86,7 +86,7 @@ I did mention above the naming for each layer of header plus data but decided to
![](Images/Day22_Networking7.png)
The application data is being sent somewhere, so the receiving side works somewhat in reverse, passing it back up the stack and into the receiving host.
![](Images/Day22_Networking8.png)

View File

@ -2,7 +2,7 @@
title: '#90DaysOfDevOps - Network Protocols - Day 23'
published: false
description: 90DaysOfDevOps - Network Protocols
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048704
@ -21,13 +21,13 @@ Connects IP addresses to fixed physical machine addresses, also known as MAC add
- FTP - File Transfer Protocol
Allows for the transfer of files from source to destination. Generally, this process is authenticated but there is the ability, if configured, to use anonymous access. You will more frequently now see FTPS which provides SSL/TLS connectivity to FTP servers from the client for better security. This protocol would be found in the Application layer of the OSI Model.
![](Images/Day23_Networking2.png)
- SMTP - Simple Mail Transfer Protocol
Used for email transmission, mail servers use SMTP to send and receive mail messages. You will still find even with Microsoft 365 that the SMTP protocol is used for the same purpose.
![](Images/Day23_Networking3.png)
@ -39,19 +39,19 @@ HTTP is the foundation of the internet and browsing content. Giving us the abili
- SSL - Secure Sockets Layer | TLS - Transport Layer Security
TLS has taken over from SSL; TLS is a [Cryptographic Protocol]() that provides secure communications over a network. It can be found in mail, instant messaging and other applications, but most commonly it is used to secure HTTPS.
![](Images/Day23_Networking5.png)
- HTTPS - HTTP secured with SSL/TLS
An extension of HTTP, used for secure communications over a network, HTTPS is encrypted with TLS as mentioned above. The focus here was to bring authentication, privacy and integrity whilst data is exchanged between hosts.
![](Images/Day23_Networking6.png)
- DNS - Domain Name System
DNS is used to map human-friendly domain names: for example, we all know [google.com](https://google.com), but if you were to open a browser and put in [8.8.8.8](https://8.8.8.8) you would get Google as we pretty much know it. However, good luck trying to remember all of the IP addresses for all of the websites you use, where for some of them we even use Google to find the information.
This is where DNS comes in, it ensures that hosts, services and other resources are reachable.
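You can watch DNS doing this mapping yourself; a quick sketch using dig (you may need to install the dnsutils package first):
```
dig +short google.com   # forward lookup: name to IP
dig +short -x 8.8.8.8   # reverse lookup: IP back to a name
```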
@ -72,15 +72,15 @@ There are 4 things that we need on every host for it to be able to achieve both
We have covered the IP address being a unique address for your host on the network where it resides; we can think of this as our house number.
The subnet mask we will cover shortly, but you can think of this as the postcode or zip code.
A default gateway is the IP of our router, generally on our network, providing us with that Layer 3 connectivity. You could think of this as the single road that allows us out of our street.
Then we have DNS, as we just covered, to help us convert complicated public IP addresses to more suitable and memorable domain names. Maybe we can think of this as the giant sorting office that makes sure we get the right post.
As I said, each host requires these 4 things; if you have 1,000 or 10,000 hosts then it is going to take you a very long time to set each one of these individually. This is where DHCP comes in: it allows you to determine a scope for your network and then this protocol will distribute addresses to all available hosts in your network.
Another example is you head into a coffee shop, grab a coffee and sit down with your laptop or your phone, let's call that your host. You connect your host to the coffee shop wifi and you gain access to the internet, messages and mail start pinging through and you can navigate web pages and social media. When you connected to the coffee shop wifi your machine would have picked up a DHCP address, either from a dedicated DHCP server or most likely from the router also handling DHCP.
![](Images/Day23_Networking8.png)
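Back at that coffee shop, you could see what DHCP handed your host with something like the below (resolvectl assumes a systemd-resolved based system):
```
ip addr show        # the IP address the host was leased
ip route show       # the default gateway handed out with the lease
resolvectl status   # the DNS servers received as part of the lease
```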
@ -90,17 +90,17 @@ A subnet is a logical subdivision of an IP network.
Subnets break large networks into smaller, more manageable networks that run more efficiently.
Each subnet is a logical subdivision of the bigger network. Connected devices within the same subnet share a common IP address identifier, enabling them to communicate with each other.
Routers manage communication between subnets.
The size of a subnet depends on the connectivity requirements and the network technology used.
An organisation is responsible for determining the number and size of the subnets within the limits of address space
available, and the details remain local to that organisation. Subnets can also be segmented into even smaller subnets for things like Point to Point links, or subnetworks supporting a few devices.
Among other advantages, segmenting large
networks into subnets enables IP address
reallocation and relieves network congestion, streamlining network communication and efficiency.
Subnets can also improve network security.
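If you want to see how a given subnet breaks down, the small ipcalc utility is handy; a sketch, assuming it is not already installed:
```
sudo apt install ipcalc
ipcalc 192.168.169.0/24   # shows the network, broadcast, host range and number of usable hosts
```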
@ -108,12 +108,9 @@ If a section of a network is compromised, it can be quarantined, making it diffi
![](Images/Day23_Networking9.png)
## Resources
- [Computer Networking full course](https://www.youtube.com/watch?v=IPvYjXCsTg8)
- [Practical Networking](http://www.practicalnetworking.net/)
See you on [Day 24](day24.md)

View File

@ -2,7 +2,7 @@
title: '#90DaysOfDevOps - Network Automation - Day 24'
published: false
description: 90DaysOfDevOps - Network Automation
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048805
@ -24,9 +24,9 @@ To break this down you would need to identify how the task or process that you'r
"If you don't know where you are going, any road will take you there."
Have a framework or design structure that you're trying to achieve, know what your end goal is, and then work step by step towards achieving that goal, measuring the automation success at various stages based on the business outcomes.
Build concepts modelled around existing applications: there's no need to design automation concepts in a bubble because they need to be applied to your application, your service, and your infrastructure, so begin to build the concepts and model them around your existing infrastructure and your existing applications.
### Approach to Networking Automation
@ -39,7 +39,7 @@ We should identify the tasks and perform a discovery on network change requests
We should then divide tasks and analyse how different network functions work and interact with each other.
- The Infrastructure/Network team receives change tickets at multiple layers to deploy applications.
- Based on Network services, divide them into different areas and understand how they interact with each other.
- Application Optimisation
- ADC (Application Delivery Controller)
@ -55,7 +55,7 @@ Reusable policies, define and simplify reusable service tasks, processes and inp
- Simplifying the deployment process will reduce the time to market for both new and existing workloads. - Simplifying the deployment process will reduce the time to market for both new and existing workloads.
- Once you have a standard process, it can be sequenced and aligned to individual requests for a multi-threaded approach and delivery. - Once you have a standard process, it can be sequenced and aligned to individual requests for a multi-threaded approach and delivery.
Combine the policies with business-specific activities. How does implementing this policy help the business? Saves time? Saves Money? Provides better business outcome? Combine the policies with business-specific activities. How does implementing this policy help the business? Saves time? Saves Money? Provides a better business outcome?
- Ensure that service tasks are interoperable. - Ensure that service tasks are interoperable.
- Associate the incremental service tasks so that they align to create business services. - Associate the incremental service tasks so that they align to create business services.
@ -80,7 +80,7 @@ Orchestrate the network service!
The good news here is that for the most part, the tools we use here for Network automation are generally the same that we will use for other areas of automation or what we have already covered so far or what we will cover in future sessions. The good news here is that for the most part, the tools we use here for Network automation are generally the same that we will use for other areas of automation or what we have already covered so far or what we will cover in future sessions.
Operating System - As I have throughout this challenge, I am focusing on doing most of my learning with a Linux OS, those reasons were given in the Linux section but almost all of the tooling that we will touch albeit cross-OS platform maybe today they all started as Linux based applications or tools, to begin with. Operating System - As I have throughout this challenge, I am focusing on doing most of my learning with a Linux OS, those reasons were given in the Linux section but almost all of the tooling that we will touch albeit cross-OS platforms maybe today they all started as Linux based applications or tools, to begin with.
Integrated Development Environment (IDE) - Again not much to say here other than throughout I would suggest Visual Studio Code as your IDE, based on the extensive plugins that are available for so many different languages. Integrated Development Environment (IDE) - Again not much to say here other than throughout I would suggest Visual Studio Code as your IDE, based on the extensive plugins that are available for so many different languages.
@ -131,7 +131,7 @@ Analyse APIs - Postman is a great tool for analysing RESTful APIs. Helps to buil
[Network Test Automation](https://pubhub.devnetcloud.com/media/genie-feature-browser/docs/#/) [Network Test Automation](https://pubhub.devnetcloud.com/media/genie-feature-browser/docs/#/)
Over the next 3 days, I am planning to get more hands-on around some of the things we have covered and put some work in around Python and Network automation. Over the next 3 days, I am planning to get more hands-on with some of the things we have covered and put some work in around Python and Network automation.
We have nowhere near covered all of the networking topics so far but wanted to make this broad enough to follow along and still keep learning from the resources I am adding below. We have nowhere near covered all of the networking topics so far but wanted to make this broad enough to follow along and still keep learning from the resources I am adding below.


@@ -13,9 +13,9 @@ Python is the standard language used for automated network operations.
Whilst it is not only for network automation, it seems to be everywhere when you are looking for resources, and as previously mentioned, if it's not Python then it's generally Ansible, which is itself written in Python.
I think I have mentioned this already, but during the "Learn a programming language" section I chose Golang over Python because my company is developing in Go, so that was a good reason for me to learn it; if that had not been the case then Python would have taken that time.
- Readability and ease of use - Python just seems to make sense. There is no requirement for `{}` in the code to start and end blocks. Couple this with a strong IDE like VS Code and you have a pretty easy start when wanting to run some Python code.
Pycharm might be another IDE worth mentioning here.
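As a tiny illustration of the readability point above (my own example, not from the post), blocks in Python are defined by indentation rather than `{}`:

```python
# Indentation marks the start and end of each block - no braces required
def interface_status(name, is_up):
    if is_up:
        return f"{name} is up"
    return f"{name} is down"

print(interface_status("gi0/0", True))
```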
@@ -59,7 +59,7 @@ We are going to do everything on EVE-NG with the community edition.
The community edition comes in ISO and OVF formats for [download](https://www.eve-ng.net/index.php/download/)
We will be using the OVF download, but with the ISO there is the option to build out on a bare-metal server without the need for a hypervisor.
![](Images/Day25_Networking1.png)
@@ -81,11 +81,11 @@ Open VMware Workstation and then select `file` and `open`
![](Images/Day25_Networking2.png)
When you download the EVE-NG OVF image it is going to be within a compressed file. Extract the contents into their own folder so it looks like this.
![](Images/Day25_Networking3.png)
Navigate to the location where you downloaded the EVE-NG OVF image and begin the import.
Give it a recognisable name and store the virtual machine somewhere on your system.
@@ -93,7 +93,7 @@ Give it a recognisable name and store the virtual machine somewhere on your syst
When the import is complete, increase the number of processors to 4 and the memory allocated to 8 GB. (This should already be the case after import with the latest version; if not, edit the VM settings.)
Also, make sure the Virtualise Intel VT-x/EPT or AMD-V/RVI checkbox is enabled. This option instructs VMware Workstation to pass the virtualisation flags to the guest OS (nested virtualisation). This was the issue I was having with GNS3 and VirtualBox even though my CPU supports this.
![](Images/Day25_Networking5.png)
@@ -111,7 +111,7 @@ When you want to go back and use WSL2 then you will need to run this command and
`bcdedit /set hypervisorlaunchtype auto`
Both of these commands should be run as administrator!
OK, back to the show. You should now have a powered-on machine in VMware Workstation and a prompt looking similar to this.


@@ -2,7 +2,7 @@
title: '#90DaysOfDevOps - Building our Lab - Day 26'
published: false
description: 90DaysOfDevOps - Building our Lab
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048762
@@ -21,7 +21,7 @@ There is also a client pack that allows us to choose which application is used w
Quick Tip: If you are using Linux as your client then there is this [client pack](https://github.com/SmartFinn/eve-ng-integration).
The install is a straightforward next, next, and I would suggest leaving the defaults.
### Obtaining network images
@@ -53,7 +53,7 @@ Inside the EVE-NG web interface, we are going to create our new network topology
#### Adding our Nodes to EVE-NG
When you first log in to EVE-NG you will see a screen like the one below; we want to start by creating our first lab.
![](Images/Day26_Networking2.png)


@@ -2,7 +2,7 @@
title: '#90DaysOfDevOps - Getting Hands-On with Python & Network - Day 27'
published: false
description: 90DaysOfDevOps - Getting Hands-On with Python & Network
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048735
@@ -15,7 +15,7 @@ We will be using an SSH tunnel to connect to our devices from our client vs teln
## Access our virtual emulated environment
To interact with our switches we either need a workstation inside the EVE-NG network, where you can deploy a Linux box with Python installed to perform your automation ([Resource for setting up Linux inside EVE-NG](https://www.youtube.com/watch?v=3Qstk3zngrY)), or you can do something like me and define a cloud for access from your workstation.
![](Images/Day27_Networking3.png)
@@ -25,13 +25,13 @@ To do this, we have right-clicked on our canvas and we have selected network and
However, we do not have anything inside this network, so we need to add connections from the new network to each of our devices. (My networking knowledge needs more attention and I feel that you could just do this next step to the top router and then have connectivity to the rest of the network through this one cable?)
I have then logged on to each of our devices and I have run through the following commands for the interfaces applicable to where the cloud comes in.
```
enable
config t
int gi0/0
ip add dhcp
no sh
exit
exit
@@ -74,7 +74,7 @@ We can use [netmiko_sendchange.py](Networking/netmiko_sendchange.py) to achieve
![](Images/Day27_Networking7.png)
Now, for those that look at the code, you will see the message appears and tells us `sending configuration to device`, but there is no confirmation that this has happened, so we could add additional code to our script to perform that check and validation on our switch, or we could modify our earlier script to show us this. [netmiko_con_multi_vlan.py](Networking/netmiko_con_multi_vlan.py)
![](Images/Day27_Networking8.png)
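As a rough sketch of the kind of check mentioned above (this is not one of the repository's scripts; the device details are placeholders for a lab like this one), Netmiko can push the VLAN and then read the VLAN table back to confirm the change landed:

```python
from netmiko import ConnectHandler

# Placeholder connection details - adjust for your own lab devices
device = {
    "device_type": "cisco_ios",
    "host": "10.10.88.110",
    "username": "admin",
    "password": "password",
}

connection = ConnectHandler(**device)

# Push the configuration change
connection.send_config_set(["vlan 99", "name 90DaysOfDevOps"])

# Validate by reading the VLAN table back from the switch
output = connection.send_command("show vlan brief")
if "90DaysOfDevOps" in output:
    print("VLAN 99 present - change confirmed")
else:
    print("VLAN 99 not found - investigate before moving on")

connection.disconnect()
```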


@@ -2,14 +2,14 @@
title: '#90DaysOfDevOps - The Big Picture: DevOps & The Cloud - Day 28'
published: false
description: 90DaysOfDevOps - The Big Picture DevOps & The Cloud
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048737
---
## The Big Picture: DevOps & The Cloud
When it comes to cloud computing and what is offered, it goes very nicely with the DevOps ethos and processes. We can think of cloud computing as bringing the technology and services, whilst DevOps, as we have mentioned many times before, is about the process and process improvement.
But to start with, that cloud learning journey is a steep one, and making sure you know and understand all the elements, or the best service to choose for the right price point, is confusing.
@@ -21,7 +21,7 @@ If we look at what we mean by the Public Cloud at a 40,000ft view, it is about r
![](Images/Day28_Cloud2.png)
In this first section, I want to get into and describe a little more of what a Public Cloud is and some of the building blocks that get referred to as the Public Cloud overall.
### SaaS
@@ -31,9 +31,9 @@ Oh, and you would also have to make sure you were backing up your data, although
What SaaS, and in particular Microsoft 365 (because I mentioned Exchange), does is remove that administration overhead and provide a service that delivers your Exchange functionality by way of mail, as well as much other productivity (Office 365) and storage (OneDrive) functionality that overall gives a great experience to the end-user.
Other SaaS applications are widely adopted, such as Salesforce, SAP, Oracle, Google, and Apple, all removing that burden of having to manage more of the stack.
I am sure there is a story with DevOps and SaaS-based applications, but I am struggling to find out what it may be. I know Azure DevOps has some great integrations with Microsoft 365 that I might look into and report back on.
![](Images/Day28_Cloud3.png)
@@ -48,15 +48,15 @@ Some will also see the public cloud as a much wider offering that includes those
![](Images/Day28_Cloud5.png)
*thousands more companies could land on this, I am merely picking from local, regional, telco and global brands I have worked with and am aware of.*
We mentioned in the SaaS section that the cloud removes the responsibility or the burden of having to administer parts of a system. With SaaS we see a lot of the abstraction layers removed, i.e. the physical systems, network, storage, operating system, and even the application to some degree. When it comes to the cloud there are various levels of abstraction we can remove or keep depending on your requirements.
We have already mentioned SaaS but there are at least two more to mention regarding the public cloud.
Infrastructure as a Service - You can think of this layer as a virtual machine, but whereas on-premises you have to look after the physical layer, in the cloud this is not the case; the physical layer is the cloud provider's responsibility, and you will manage and administer the operating system, the data and the applications you wish to run.
Platform as a Service - This continues to remove responsibility for layers; this is really about you taking control of the data and the application but not having to worry about the underpinning hardware or operating system.
There are many other aaS offerings out there but these are the two fundamentals. You might see offerings around StaaS (Storage as a Service), which provides you with your storage layer without having to worry about the hardware underneath. Or you might have heard of CaaS (Containers as a Service), which we will get onto later on. Another aaS we will look to cover over the next 7 days is FaaS (Functions as a Service), where maybe you do not need a system running all the time and you just want a function to be executed as and when.
There are many ways in which the public cloud can provide abstraction layers of control that you wish to pass up and pay for.


@@ -2,7 +2,7 @@
title: '#90DaysOfDevOps - Microsoft Azure Fundamentals - Day 29'
published: false
description: 90DaysOfDevOps - Microsoft Azure Fundamentals
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048705
@@ -71,7 +71,7 @@ A subscription can be seen as a boundary between different subscriptions potenti
### Management Groups
Management groups give us the ability to segregate control across our Azure Active Directory (AD) or our tenant environment. Management groups allow us to control policies, Role-Based Access Control (RBAC), and budgets.
Subscriptions belong to these management groups, so you could have many subscriptions in your Azure AD tenant; these subscriptions can then also control policies, RBAC, and budgets.


@@ -9,11 +9,11 @@ id: 1049039
---
## Microsoft Azure Security Models
Following on from the Microsoft Azure overview, we are going to start with Azure security and see where this can help in our day to day. For the most part, I have found the built-in roles to be sufficient, but it is worth knowing that we can create and work with many different areas of authentication and configuration. I have found Microsoft Azure to be quite advanced with its Active Directory background compared to other public clouds.
## Microsoft Azure Security Models
This is one area in which Microsoft Azure seemingly works differently from other public cloud providers; in Azure there is ALWAYS Azure AD.
### Directory Services
@@ -104,7 +104,7 @@ We can also use the check access tab if we want to check an account against this
- Free tier includes continuous assessment and security recommendations.
- Paid plans for protected resource types (e.g. Servers, AppService, SQL, Storage, Containers, KeyVault).
I have switched to another subscription to view the Azure Security Center, and you can see here, based on very few resources, that I have some recommendations in one place.
@@ -112,7 +112,7 @@ I have switched to another subscription to view the Azure Security Center and yo
### Azure Policy
- Azure Policy is an Azure native service that helps to enforce organizational standards and assess compliance at scale.
- Integrated into Microsoft Defender for Cloud. Azure Policy audits non-compliant resources and applies remediation.
@@ -120,7 +120,7 @@ I have switched to another subscription to view the Azure Security Center and yo
- Uses JSON format to store evaluation logic and determine whether a resource is compliant or not, and any actions to take for non-compliance (e.g. Audit, AuditIfNotExists, Deny, Modify, DeployIfNotExists).
- Free to use. The exception is Azure Arc connected resources, which are charged per server/month for Azure Policy Guest Configuration usage.
### Hands-On
@@ -128,15 +128,15 @@ I have gone out and I have purchased www.90DaysOfDevOps.com and I would like to
![](Images/Day30_Cloud9.png)
With that done, we can now create a new user on our new Active Directory domain.
![](Images/Day30_Cloud10.png)
Now we want to gather all of our new 90DaysOfDevOps users in one group. We can create a group as per the below; notice that I am using "Dynamic User", which means Azure AD will query user accounts and add them dynamically, vs Assigned, which is where you manually add the user to your group.
![](Images/Day30_Cloud11.png)
There are lots of options when it comes to creating your query; I plan to simply find the principal name and make sure that the name contains @90DaysOfDevOps.com.
![](Images/Day30_Cloud12.png)
@@ -148,7 +148,7 @@ I have since added a new user1@90DaysOfDevOps.com and if we go and check the gro
![](Images/Day30_Cloud14.png)
If we have this requirement x100 then we are not going to want to do this all in the console; we are going to want to take advantage of either the bulk options to create, invite and delete users, or we are going to want to look into PowerShell to achieve this automated approach at scale.
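The post points at the portal's bulk operations or PowerShell for this; purely as an illustrative alternative in Python (my own sketch, assuming you already have a Microsoft Graph access token with permission to create users and that your tenant domain is 90DaysOfDevOps.com), the same loop could be driven through the Microsoft Graph API:

```python
import requests

GRAPH_USERS_URL = "https://graph.microsoft.com/v1.0/users"
ACCESS_TOKEN = "<token with User.ReadWrite.All>"  # assumed to be obtained separately

def create_user(display_name: str, alias: str, password: str) -> None:
    """Create a single Azure AD user via the Microsoft Graph users endpoint."""
    body = {
        "accountEnabled": True,
        "displayName": display_name,
        "mailNickname": alias,
        "userPrincipalName": f"{alias}@90DaysOfDevOps.com",
        "passwordProfile": {
            "forceChangePasswordNextSignIn": True,
            "password": password,
        },
    }
    response = requests.post(
        GRAPH_USERS_URL,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json=body,
    )
    response.raise_for_status()

# Create user1..user100 in one go instead of clicking through the portal
for i in range(1, 101):
    create_user(f"User {i}", f"user{i}", "ChangeMe-123!")
```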
Now we can go to our resource group and specify that on the 90DaysOfDevOps resource group we want the owner to be the group we just created.
@@ -156,15 +156,15 @@ Now we can go to our Resource Group and specify that on the 90DaysOfDevOps resou
We can equally go in here and deny assignments access to our resource group as well.
Now if we log in to the Azure Portal with our new user account, you can see that we only have access to our 90DaysOfDevOps resource group and not the others seen in previous pictures, because we do not have access.
![](Images/Day30_Cloud16.png)
The above is great if this is a user that has access to resources inside of your Azure portal, but not every user needs to be aware of the portal. To check access we can use the [Apps Portal](https://myapps.microsoft.com/), a single sign-on portal for us to test.
![](Images/Day30_Cloud17.png)
You can customise this portal with your own branding, and this might be something we come back to later on.
## Resources


@@ -2,7 +2,7 @@
title: '#90DaysOfDevOps - Microsoft Azure Storage Models - Day 32'
published: false
description: 90DaysOfDevOps - Microsoft Azure Storage Models
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048775
@@ -72,6 +72,8 @@ When it comes to storage performance we have two different types:
- **Standard** - Maximum number of IOPS
- **Premium** - Guaranteed number of IOPS
IOPS => Input/Output operations per second.
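As a rough back-of-the-envelope illustration (my own example, not an Azure figure): a disk capped at 500 IOPS performing 8 KiB operations can sustain roughly 500 × 8 KiB ≈ 4 MB/s of throughput, so both the IOPS figure and the typical I/O size matter when sizing storage.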
There is also a difference between unmanaged and managed disks to consider when choosing the right storage for the task you have.
### Virtual Machine Storage


@@ -2,14 +2,14 @@
title: '#90DaysOfDevOps - Installing & Configuring Git - Day 36'
published: false
description: 90DaysOfDevOps - Installing & Configuring Git
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048738
---
## Installing & Configuring Git
Git is an open source, cross-platform tool for version control. If, like me, you are using Ubuntu or most Linux environments, you might find that you already have git installed, but we are going to run through the install and configuration.
Even if you already have git installed on your system, it is a good idea to make sure we are up to date.
@@ -136,7 +136,7 @@ Some of the major benefits of Git are:
- Flexible
- Safe & Secure
Unlike the client-server version control model, each developer downloads the entire source repository: the history of commits, all the branches, etc.
![](Images/Day36_Git16.png)


@@ -132,5 +132,6 @@ A container image is a lightweight, standalone, executable package of software t
- [TechWorld with Nana - Docker Tutorial for Beginners](https://www.youtube.com/watch?v=3c-iBn73dDE)
- [Programming with Mosh - Docker Tutorial for Beginners](https://www.youtube.com/watch?v=pTFZFxd4hOI)
- [Docker Tutorial for Beginners - What is Docker? Introduction to Containers](https://www.youtube.com/watch?v=17Bl31rlnRM&list=WL&index=128&t=61s)
- [Introduction to Containers by Red Hat](https://www.redhat.com/en/topics/containers)
See you on [Day 43](day43.md)


@@ -2,7 +2,7 @@
title: '#90DaysOfDevOps - Docker Images & Hands-On with Docker Desktop - Day 44'
published: false
description: 90DaysOfDevOps - Docker Images & Hands-On with Docker Desktop
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048708
@@ -23,7 +23,7 @@ If you scroll down once logged in you are going to see a list of container image
![](Images/Day44_Containers2.png)
We can drill deeper into the view of available images and search across categories, operating systems and architectures. The one thing I highlight below is the Official Image; this should give you peace of mind about the origin of this container image.
![](Images/Day44_Containers3.png)


@@ -2,7 +2,7 @@
title: '#90DaysOfDevOps - The anatomy of a Docker Image - Day 45'
published: false
description: 90DaysOfDevOps - The anatomy of a Docker Image
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048777
@@ -25,7 +25,7 @@ We should organise our layers that change most often as high in the stack as pos
Each time docker launches a container from an image (like we ran yesterday) it adds a writeable layer, known as the container layer. This stores all changes to the container throughout its runtime. This layer is the only difference between a live operational container and the source image itself. Any number of like-for-like containers can share access to the same underlying image while maintaining their own individual state.
Back to the example we used yesterday with the Ubuntu image. We could run that same command multiple times; on the first container we could install pinta and on the second we could install figlet: two different applications, with different purposes and different sizes. Each container that we deploy shares the same image but not the same state, and that state is gone when we remove the container.
![](Images/Day45_Containers1.png)
@@ -59,7 +59,7 @@ The following table shows some of the dockerfile statements we will be using or
| COPY | To copy over files or directories from a specific location. |
| ADD | As COPY, but also able to handle remote URLs and unpack compressed files. |
| ENTRYPOINT | Command that will always be executed when the container starts. If not specified, the default is /bin/sh -c |
| CMD | Arguments passed to the entrypoint. If ENTRYPOINT is not set (defaults to /bin/sh -c), the CMD will be the commands the container executes. |
| EXPOSE | To define the port through which to access your container application. |
| LABEL | To add metadata to the image. |


@@ -29,17 +29,17 @@ Important things to note from the above quote, Kubernetes is Open-Source with a
I mentioned above that containers are great, and in the previous section we spoke about how containers and container images have changed and accelerated the adoption of cloud-native systems. But containers alone are not going to give you the production-ready experience you need from your application. Kubernetes gives us the following (a small illustrative sketch follows below):
- **Service discovery and load balancing** Kubernetes can expose a container using the DNS name or using their own IP address. If traffic to a container is high, Kubernetes is able to load balance and distribute the network traffic so that the deployment is stable.
- **Storage orchestration** Kubernetes allows you to automatically mount a storage system of your choice, such as local storage, public cloud providers, and more.
- **Automated rollouts and rollbacks** You can describe the desired state for your deployed containers using Kubernetes, and it can change the actual state to the desired state at a controlled rate. For example, you can automate Kubernetes to create new containers for your deployment, remove existing containers and adopt all their resources to the new container.
- **Automatic bin packing** You provide Kubernetes with a cluster of nodes that it can use to run containerized tasks. You tell Kubernetes how much CPU and memory (RAM) each container needs. Kubernetes can fit containers onto your nodes to make the best use of your resources.
- **Self-healing** Kubernetes restarts containers that fail, replaces containers, kills containers that don't respond to your user-defined health check, and doesn't advertise them to clients until they are ready to serve.
- **Secret and configuration management** Kubernetes lets you store and manage sensitive information, such as passwords, OAuth tokens, and SSH keys. You can deploy and update secrets and application configuration without rebuilding your container images, and without exposing secrets in your stack configuration.
Kubernetes provides you with a framework to run distributed systems resiliently.
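To make the desired-state idea above a touch more concrete, here is a small illustrative sketch (my own, not from the post) using the `kubernetes` Python client to compare a Deployment's desired replicas with what is actually available; it assumes a working kubeconfig and an example Deployment named `nginx` in the `default` namespace:

```python
from kubernetes import client, config

# Load credentials from the local kubeconfig (the same one kubectl uses)
config.load_kube_config()

apps = client.AppsV1Api()

# Assumed example: a Deployment called "nginx" in the "default" namespace
deployment = apps.read_namespaced_deployment(name="nginx", namespace="default")

desired = deployment.spec.replicas
available = deployment.status.available_replicas or 0

print(f"Desired replicas:   {desired}")
print(f"Available replicas: {available}")

if available < desired:
    print("Kubernetes is still reconciling actual state towards the desired state")
else:
    print("Actual state matches the desired state")
```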
@@ -185,7 +185,6 @@ A Pod is a group of containers that form a logical application. For e.g. If you
- Each Pod has a unique, persistent identifier that the controller maintains over any rescheduling.
![](Images/Day49_Kubernetes10.png)


@@ -2,7 +2,7 @@
title: '#90DaysOfDevOps - Deploying your first Kubernetes Cluster - Day 51'
published: false
description: 90DaysOfDevOps - Deploying your first Kubernetes Cluster
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048778


@@ -2,7 +2,7 @@
title: '#90DaysOfDevOps - Rancher Overview - Hands On - Day 53'
published: false
description: 90DaysOfDevOps - Rancher Overview - Hands On
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048742


@@ -2,11 +2,12 @@
title: '#90DaysOfDevOps - The Big Picture: CI/CD Pipelines - Day 70'
published: false
description: 90DaysOfDevOps - The Big Picture CI/CD Pipelines
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048836
---
## The Big Picture: CI/CD Pipelines
A CI/CD (Continuous Integration/Continuous Deployment) pipeline implementation is the backbone of the modern DevOps environment.
@@ -110,6 +111,7 @@ My plan is to look at the following:
## Resources
- [Jenkins is the way to build, test, deploy](https://youtu.be/_MXtbjwsz3A)
- [Introduction to Jenkins](https://www.edx.org/course/introduction-to-jenkins)
- [Jenkins.io](https://www.jenkins.io/)
- [ArgoCD](https://argo-cd.readthedocs.io/en/stable/)
- [ArgoCD Tutorial for Beginners](https://www.youtube.com/watch?v=MeU5_k9ssrs)


@@ -1,5 +1,5 @@
---
title: '#90DaysOfDevOps - Introduction - Day 1'
published: true
description: 90DaysOfDevOps - Introduction
tags: 'devops, 90daysofdevops, learning'
@@ -48,7 +48,7 @@ DevOps is a set of ... to reach the goals of this movement
From a DevOps point of view, **development, testing and deployment** all come together in the DevOps team.
Finally, I want to point out that to make this as effective and efficient as possible, we must take advantage of **automation**.
## Resources


@@ -1,68 +1,67 @@
---
title: '#90DaysOfDevOps - Responsibilities of a DevOps Engineer - Day 2'
published: false
description: 90DaysOfDevOps - Responsibilities of a DevOps Engineer
tags: 'devops, 90daysofdevops, learning'
cover_image: null
canonical_url: null
id: 1048699
date: '2022-04-17T21:15:34Z'
---
## Responsibilities of a DevOps Engineer
Hopefully you are coming into this off the back of going through the resources and post on [Day 1 of #90DaysOfDevOps](day01.md)
It was briefly touched on in the first post, but now we must get deeper into this concept and understand that there are two main parts when creating an application. We have the **Development** part, where software developers program the application and test it. Then we have the **Operations** part, where the application is deployed and maintained on a server.
## DevOps is the link between the two
To get to grips with DevOps, or the tasks a DevOps engineer would be carrying out, we need to understand the tools and processes, get an overview of them, and see how they come together.
Everything starts with the application! You will see throughout that it is all about the application when it comes to DevOps.
Developers will create an application; this can be done with many different technology stacks, and let's leave that to the imagination for now as we get into it later. This can also involve many different programming languages, build tools, code repositories etc.
As a DevOps engineer you won't be programming the application, but having a good understanding of the concepts of how a developer works and the systems, tools and processes they are using is key to success.
At a very high level, you are going to need to know how the application is configured to talk to all of its required services or data services, and also have an idea of how this can or should be tested.
The application will need to be deployed somewhere; let's keep it simple here and make this a server, it doesn't matter where. This is then expected to be accessed by the customer or end user, depending on the application that has been created.
This server needs to run somewhere: on-premises, in a public cloud, or serverless (OK, I have gone too far, we won't be covering serverless, but it's an option and more and more enterprises are heading this way). Someone needs to create and configure these servers and get them ready for the application to run. Now this element might land with you as a DevOps engineer to deploy and configure these servers.
These servers will have to run an operating system, and generally speaking this is going to be Linux, but we have a whole section or week where we cover some of the foundational knowledge you should gain here.
It is also likely that we need to communicate with other services in our network or environment, so we also need to have that level of knowledge around networking and configuring it; this might to some degree also land at the feet of the DevOps engineer. Again, we will cover this in more detail in a dedicated section talking all things DNS, DHCP, load balancing etc.
## Jack of all trades, Master of none
I will say at this point, though, that you don't need to be a network or infrastructure specialist; you need foundational knowledge of how to get things up and running and talking to each other, much the same as having a foundational knowledge of a programming language without needing to be a developer. However, you might be coming into this as a specialist in one area, and that is a great footing to adapt to other areas.
You will also most likely not take over the management of these servers or the application on a daily basis.
We have been talking about servers, but the likelihood is that your application will be developed to run as containers, which still run on a server for the most part, but you will also need an understanding of not only virtualisation and Cloud Infrastructure as a Service (IaaS) but containerisation as well. The focus in these 90 days will be catered more towards containers.
## High Level Overview
On one side we have our developers creating new features and functionality (as well as bug fixes) for the application.
On the other side we have some sort of environment, infrastructure or servers which are configured and managed to run this application and communicate with all its required services.
The big question is, how do we get those features and bug fixes into production and make them available to the end users?
How do we release the new application version? This is one of the main tasks for a DevOps engineer, and the important thing here is not just to figure out how to do this once; we need to do this continuously and in an automated, efficient way, which also needs to include testing!
This is where we are going to end this day of learning; hopefully this was useful. Over the next few days we are going to dive a little deeper into some more areas of DevOps, and then we will get into the sections that dive deeper into the tooling and processes and the benefits of these.
## Resources
I am always open to adding additional resources to these readme files, as they are here as a learning tool.
My advice is to watch all of the below and hopefully you also picked something up from the text and explanations above.
- [What is DevOps? - TechWorld with Nana](https://www.youtube.com/watch?v=0yWAtQ6wYNM)
- [What is DevOps? - GitHub YouTube](https://www.youtube.com/watch?v=kBV8gPVZNEE)
- [What is DevOps? - IBM YouTube](https://www.youtube.com/watch?v=UbtB4sMaaNM)
- [What is DevOps? - AWS](https://aws.amazon.com/devops/what-is-devops/)
- [What is DevOps? - Microsoft](https://docs.microsoft.com/en-us/devops/what-is-devops)
If you made it this far then you will know if this is where you want to be or not. See you on [Day 3](day03.md).


@ -1,77 +1,80 @@
--- ---
title: '#90DaysOfDevOps - Application Focused - Day 3' title: '#90DaysOfDevOps - アプリケーションフォーカス - 3日目'
published: false published: false
description: 90DaysOfDevOps - Application Focused description: 90DaysOfDevOps - アプリケーションフォーカス
tags: "devops, 90daysofdevops, learning" tags: "devops, 90daysofdevops, learning"
cover_image: null cover_image: null
canonical_url: null canonical_url: null
id: 1048825 id: 1048825
--- ---
## DevOps Lifecycle - Application Focused ## DevOps ライフサイクル - アプリケーションフォーカス
もしあなたがDevOpsエンジニアの役割を目指しているのであれば、繰り返し行うことに慣れるでしょうが、毎回常に向上させることは、物事を面白く保つもう1つのポイントです。
As we continue through these next few weeks we are 100% going to come across these titles (Continuous Development, Testing, Deployment, Monitor) over and over again, If you are heading towards the DevOps Engineer role then repeatability will be something you will get used to but constantly enhancing each time is another thing that keeps things interesting. この時間では、アプリケーションの開始から終了までのハイレベルなビューを見て、一定のループのように戻ってくることにします。
In this hour we are going to take a look at the high level view of the application from start to finish and then back round again like a constant loop. ### 開発
### Development 開発者として、クライアントやエンドユーザーと要件について話し合い、アプリケーションのためのある種の計画や要件を考え出さなければならないかもしれません。次に、要件から新しいアプリケーションを作成する必要があります。
Let's take a brand new example of an Application, to start with we have nothing created, maybe as a developer you have to discuss with your client or end user on the requirements and come up with some sort of plan or requirements for your Application. We then need to create from the requirements our brand new application.
In regards to tooling at this stage there is no real requirement here other than choosing your IDE and the programming language you wish to use to write your application. この段階では、IDEとアプリケーションを書くのに使いたいプログラミング言語を選択する以外には、ツールに関する本当の要件はありません。
As a DevOps engineer, remember you are probably not the one creating this plan or coding the application for the end user, this will be a skilled developer. DevOpsエンジニアとして、あなたはおそらくこの計画を作成したり、エンド・ユーザーのためにアプリケーションをコーディングしたりする人ではないことを忘れないでください。
But it also would not hurt for you to be able to read some of the code so that you can make the best infrastructure decisions moving forward for your application. しかし、アプリケーションのために前進する最適なインフラストラクチャの決定を行うことができるように、コードの一部を読むことができても問題ないでしょう。
We previously mentioned that this application can be written in any language. Importantly this should be maintained using a version control system, this is something we will cover also in detail later on and in particular we will dive into **Git**. このアプリケーションはどのような言語でも書けることは前述しました。重要なのは、バージョン管理システムを使って保守することです。これは後ほど詳しく説明し、特に **Git** について掘り下げます。
It is also likely that it will not be just one developer working on this project, although it could be, but even so best practices would require a code repository to store and collaborate on the code. This could be private or public and could be hosted or privately deployed; generally speaking you would hear the likes of **GitHub or GitLab** being used as a code repository. Again we will cover these as part of our section on **Git** later on. また、このプロジェクトに取り組む開発者は一人ではないと思われますが、それでもベストプラクティスでは、コードを保存して共同作業するためのコードリポジトリが必要です。これはプライベートでもパブリックでもよく、ホスティングでもプライベートでも展開できます。後ほど、**Git**のセクションの一部として、これらもカバーします。
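To make the version-control idea a little more concrete, here is a minimal sketch of the commands a developer might run to get their code into a shared repository; the remote URL is a placeholder and the details are covered properly in the Git section later on.

```bash
# Minimal version-control workflow (placeholder remote URL, shown for illustration only)
git init                                    # turn the project folder into a Git repository
git add .                                   # stage the application code
git commit -m "Initial application commit"  # record the first version
git remote add origin https://github.com/example-org/example-app.git
git push -u origin main                     # share it on GitHub/GitLab for collaboration
```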
### Testing ### テスト
At this stage we have our requirements and we have our application being developed. But we need to make sure we are testing our code in all the various different environments that we have available to us or specifically maybe to the programming language chosen.
This phase enables QA to test for bugs, more frequently we see containers being used for simulating the test environment which overall can improve on cost overheads of physical or cloud infrastructure. この段階で、私たちは要件を満たし、アプリケーションを開発することができます。しかし、私たちは、私たちが利用可能なすべての様々な異なる環境、または特に選択したプログラミング言語で私たちのコードをテストしていることを確認する必要があります。
This phase is also likely going to be automated as part of the next area which is Continuous Integration. このフェーズでは、QAがバグをテストすることができます。テスト環境のシミュレーションにコンテナが使用されることが多くなり、物理インフラやクラウドインフラのコスト・オーバーヘッドを全体的に改善することができます。
The ability to automate this testing vs 10s,100s or even 1000s of QA engineers having to do this manually speaks for itself, these engineers can focus on something else within the stack to ensure you are moving faster and developing more functionality vs testing bugs and software which tends to be the hold up on most traditional software releases that use a waterfall methodology. このフェーズは、次の継続的インテグレーションContinuous Integrationの一部として自動化される可能性も高い。
### Integration QAエンジニアが数十人、数百人、あるいは千人単位で手作業でテストを行うのに対して、このテストを自動化できることは、それを物語っています。このエンジニアはスタック内で他のことに集中できるので、ウォーターフォール手法を使用した従来のソフトウェアのリリースで滞りがちだったバグやソフトウェアのテストではなく、より速く、より機能的に開発することが可能になります。
Quite importantly, Integration sits at the middle of the DevOps lifecycle. It is the practice in which developers are required to commit changes to the source code more frequently. This could be on a daily or weekly basis. インテグレーションは、DevOpsのライフサイクルの中で最も重要なものです。これは、開発者がより頻繁にソースコードに変更をコミットすることを必要とするプラクティスである。これは、日次または週次ベースで可能です。
With every commit your application can go through the automated testing phases and this allows for early detection of issues or bugs before the next phase. インテグレーションは、DevOpsのライフサイクルの中で最も重要なものです。これは、開発者がより頻繁にソースコードに変更をコミットすることを必要とするプラクティスである。これは、日次または週次ベースで可能です。
Now you might at this stage be saying "but we don't create applications, we buy it off the shelf from a software vendor" Don't worry many companies do this and will continue to do this and it will be the software vendor that is concentrating on the above 3 phases but you might want to still adopt the final phase as this will enable for faster and more efficient deployments of your off the shelf deployments. コミットするたびに、アプリケーションは自動テストフェーズを通過することができ、次のフェーズの前に問題やバグを早期に発見することができるようになります。
I would also suggest just having this above knowledge is very important as you might buy off the shelf software today, but what about tomorrow or down the line... next job maybe? この段階で、「でも、うちはアプリケーションを作らずに、ソフトウェアベンダーから既製品を買っています」と言うかもしれません。心配しないでください。多くの企業がそうしていますし、これからもそうするでしょう。
### Deployment また、このような知識を持つことは非常に重要です。今日は既製のソフトウェアを購入するかもしれませんが、明日やその先...次の仕事ではどうでしょうか?
Ok so we have our application built and tested against the requirements of our end user and we now need to go ahead and deploy this application into production for our end users to consume.
This is the stage where the code is deployed to the production servers, now this is where things get extremely interesting and it is where the rest of our 86 days dives deeper into these areas. Because different applications require different possibly hardware or configurations. This is where **Application Configuration Management** and **Infrastructure as Code** could play a key part in your DevOps lifecycle. It might be that your application is **Containerised** but also available to run on a virtual machine. Which then also leads us onto platforms like **Kubernetes** which would be orchestrating those containers and making sure you have the desired state available to your end users. ### デプロイメント
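As a small taste of what that looks like in practice, the sketch below assumes a containerised application, an existing Kubernetes cluster and a `deployment.yaml` manifest that already describes the desired state; all of these are placeholders and each of these topics gets its own section later in the 90 days.

```bash
# Hand the desired state to the orchestrator (manifest name is a placeholder)
kubectl apply -f deployment.yaml
# Check that the containers the platform is orchestrating are running
kubectl get pods
```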
All of these bold topics we will go into more detail over the next few weeks to get a better foundational knowledge of what they are and when to use them. さて、私たちはアプリケーションを構築し、エンドユーザーの要求に対してテストを行いました。
### Monitoring これは、コードを本番サーバーにデプロイする段階ですが、ここが非常に面白くなるところで、86日間の残りの時間は、これらの領域に深く潜っていきます。なぜなら、アプリケーションによって、必要となるハードウェアや構成が異なるからです。そこで、**アプリケーション構成管理**と**Infrastructure as Code**がDevOpsライフサイクルの中で重要な役割を果たすことになります。アプリケーションは**コンテナ化**されているかもしれませんが、仮想マシン上で実行することも可能です。そして、コンテナをオーケストレーションし、エンドユーザーが望む状態を利用できるようにする **Kubernetes** のようなプラットフォームにもつながっていくのです。
Things are moving fast here and we have our Application that we are continuously updating with new features and functionality and we have our testing making sure no gremlins are being found. We have the application running in our environment that can be continually keeping the required configuration and performance. これらの大胆なトピックはすべて、今後数週間でさらに詳しく説明し、それらが何であり、どのような場合に使用するのかについての基礎知識を深めていきます。
But now we need to be sure that our end users are getting the experience they require. Here we need to make sure that our Application Performance is continuously being monitored, this phase is going to allow your developers to make better decisions about enhancements to the application in future releases to better serve the end users. ### モニタリング
This section is also where we are going to capture that feedback wheel about the features that have been implemented and how the end users would like to make these better for them. 私たちは、新しい機能や特徴を持つアプリケーションを継続的に更新し、グレムリンが発見されていないことを確認するためのテストを行っています。必要な構成とパフォーマンスを継続的に維持できる環境で、アプリケーションを実行させています。
Reliability is a key factor here as well, at the end of the day we want our Application to be available all the time it is required. This then lends to other **observability, security and data management** areas that should be continuously monitored and feedback can always be used to better enhance, update and release the application continuously. しかし、今度は、エンドユーザーが必要とする体験を得られるかどうかを確認する必要があります。ここでは、アプリケーションのパフォーマンスが継続的に監視されていることを確認する必要があります。このフェーズでは、開発者が将来のリリースでアプリケーションを強化し、エンドユーザにより良いサービスを提供するためのより良い決定を下すことができるようになります。
Some input from the community here, specifically [@_ediri](https://twitter.com/_ediri), mentioned that as part of this continuous process we should also have the FinOps teams involved. Apps & Data are running and stored somewhere, and you should be monitoring this continuously to make sure that if things change from a resources point of view, your costs are not causing some major financial pain on your Cloud Bills. また、このセクションでは、実装された機能と、エンドユーザがそれらをどのように改善したいかについてのフィードバックの輪を捉えるところでもあります。
I think it is also a good time to bring up the "DevOps Engineer" mentions above, albeit there are many DevOps Engineer positions in the wild that people hold, this is not really the ideal way of positioning the process of DevOps. What I mean is from speaking to others in the community the title of DevOps Engineer should not be the goal for anyone because really any position should be adopting DevOps processes and the culture explained here. DevOps should be used in many different positions such as Cloud-Native engineer/architect, virtualisation admin, cloud architect/engineer, infrastructure admin. This is to name a few but the reason for using DevOps Engineer above was really to highlight the scope or the process used by any of the above positions and more. 信頼性はここでも重要な要素です。結局のところ、私たちはアプリケーションが必要とされるときはいつでも利用できるようにしたいのです。これは、継続的に監視されるべき他の**監視性、セキュリティ、およびデータ管理**の分野につながり、フィードバックは常に、アプリケーションを継続的に強化、更新、およびリリースするために使用することができます。
## Resources この継続的なプロセスには、FinOpsチームも関与すべきであると、特に[@_ediri](https://twitter.com/_ediri)はコミュニティからのいくつかの意見に言及しました。アプリとデータはどこかで実行され、保存されています。リソースの観点から物事が変化した場合、そのコストがクラウド請求書に大きな財務的苦痛を与えていないことを確認するために、これを継続的に監視する必要があります。
I am always open to adding additional resources to these readme files as it is here as a learning tool. また、上記の「DevOpsエンジニア」の話を持ち出す良い機会だと思います。世間には多くのDevOpsエンジニアのポジションがありますが、これはDevOpsのプロセスを位置づける上で理想的な方法ではありません。私が言いたいのは、コミュニティの他の人たちと話すと、DevOpsエンジニアという肩書きは誰にとってもゴールではないはずだということです。なぜなら、本当にどんなポジションでも、DevOpsプロセスやここで説明したカルチャーを採用すべきだからです。DevOpsは、Cloud-Nativeエンジニア/アーキテクト、仮想化管理者、クラウドアーキテクト/エンジニア、インフラ管理者など、様々なポジションで使用されるべきものです。これはほんの一例ですが、上記のDevOps Engineerを使用した理由は、上記のポジションやその他のポジションで使用される範囲やプロセスを強調するためです。
My advice is to watch all of the below and hopefully you also picked something up from the text and explanations above. ## リソース
このReadmeファイルは学習用として用意したものなので、いつでも追加リソースを受け付けています。
私のアドバイスは、以下のビデオをすべて見ること、そして上記のテキストや説明から何かを感じ取っていただけることを願っています。
- [Continuous Development](https://www.youtube.com/watch?v=UnjwVYAN7Ns) I will also add that this is focused on manufacturing but the lean culture can be closely followed with DevOps. - [Continuous Development](https://www.youtube.com/watch?v=UnjwVYAN7Ns) I will also add that this is focused on manufacturing but the lean culture can be closely followed with DevOps.
- [Continuous Testing - IBM YouTube](https://www.youtube.com/watch?v=RYQbmjLgubM) - [Continuous Testing - IBM YouTube](https://www.youtube.com/watch?v=RYQbmjLgubM)
@ -81,4 +84,4 @@ My advice is to watch all of the below and hopefully you also picked something u
- [FinOps Foundation - What is FinOps](https://www.finops.org/introduction/what-is-finops/) - [FinOps Foundation - What is FinOps](https://www.finops.org/introduction/what-is-finops/)
- [**NOT FREE** The Phoenix Project: A Novel About IT, DevOps, and Helping Your Business Win](https://www.amazon.co.uk/Phoenix-Project-DevOps-Helping-Business-ebook/dp/B00AZRBLHO) - [**NOT FREE** The Phoenix Project: A Novel About IT, DevOps, and Helping Your Business Win](https://www.amazon.co.uk/Phoenix-Project-DevOps-Helping-Business-ebook/dp/B00AZRBLHO)
If you made it this far then you will know if this is where you want to be or not. See you on [Day 4](day04.md). ここまで来れば、ここが自分の居場所かどうかが分かるはずです。では、[4日目](day04.md)でお会いしましょう。
@ -1,99 +1,98 @@
--- ---
title: '#90DaysOfDevOps - DevOps & Agile - Day 4' title: '#90DaysOfDevOps - DevOps & アジャイル - 4日目'
published: false published: false
description: 90DaysOfDevOps - DevOps & Agile description: 90DaysOfDevOps - DevOps & アジャイル
tags: 'devops, 90daysofdevops, learning' tags: 'devops, 90daysofdevops, learning'
cover_image: null cover_image: null
canonical_url: null canonical_url: null
id: 1048700 id: 1048700
--- ---
## DevOps & Agile ## DevOps & アジャイル
Do you know the difference between DevOps and Agile? They were formed as standalone concepts. But now the two terms are getting fused. DevOpsとアジャイルの違いをご存知でしょうかこれらは独立した概念として形成されていました。しかし、今ではこの2つの用語は融合しつつある。
In this post, we will examine the crucial differences between Agile and DevOps and find out why the two are connected so tightly. この記事では、アジャイルとDevOpsの決定的な違いを検証し、なぜこの2つが密接に結びついているのかを探っていきます。
I think a good place to start is understanding a little more about a common angle I have seen in learning this area and that is DevOps vs Agile, even though they have similar goals and processes. In this section, I am going to summarise this hopefully. 私がこの分野を学ぶ上でよく目にする角度、それはDevOps vs アジャイルで、両者は同じような目標とプロセスを持っているにもかかわらず、もう少し理解することから始めると良いと思います。このセクションでは、このことを希望的にまとめていこうと思います。
Let's start with definitions. まず、定義から始めましょう。
### Agile Development
Agile is an approach that focuses on delivering small results faster rather than releasing one big interaction of the product; software is developed in iterations. The team releases a new version every week or month with incremental updates. The final goal of Agile is to deliver an optimal experience to the end-users. ### アジャイル開発
アジャイルは、製品の1つの大きなインタラクションをリリースするのではなく、小さな成果をより早く提供することに焦点を当てたアプローチであり、ソフトウェアはイテレーションで開発される。チームは、毎週または毎月、インクリメンタルアップデートで新バージョンをリリースします。アジャイルの最終目標は、エンドユーザーに最適なエクスペリエンスを提供することです。
### DevOps ### DevOps
We have been covering this for the past few days with a few different ways of describing the end goals of DevOps. DevOps usually describes software development ここ数日、DevOpsの最終目標を説明するいくつかの異なる方法で、これを取り上げてきました。DevOpsは通常、ソフトウェア開発者と運用スペシャリストの協力に基づくソフトウェア開発とデリバリープラクティスを説明するものです。DevOpsの主な利点は、簡素化された開発プロセスを提供し、ミスコミュニケーションを最小限に抑えることです。
and delivery practices based on cooperation between software developers and operations specialists. The main DevOps benefits are delivering a simplified development process and minimising miscommunication.
## What is the difference between Agile and DevOps ## アジャイルとDevOpsの違いとは
The difference is mainly the preoccupations. Agile and DevOps have different preoccupations but they are helping each other. Agile wants short iteration, which is only possible with the automation that DevOps brings. Agile wants the customer to try a specific version and quickly give feedback which is only possible if DevOps make the creation of new environment easy. その違いは、主に先入観です。アジャイルとDevOpsは、それぞれ異なる関心事を持っていますが、互いに助け合っているのです。アジャイルは短いイテレーションを求めますが、それはDevOpsがもたらす自動化によってのみ可能になります。アジャイルは、顧客が特定のバージョンを試し、すぐにフィードバックを得られることを望んでいるが、これはDevOpsが新しい環境の作成を容易にする場合にのみ可能です。
### Different participants ### 参加者が異なる
Agile focuses on optimising communication between end-users and developers while DevOps targets developers and operation team members. We could say that agile is outward-oriented towards customers whereas DevOps is a set of internal practices. アジャイルはエンドユーザーと開発者のコミュニケーションの最適化に焦点を当て、DevOpsは開発者と運用チームメンバーを対象としています。アジャイルは顧客に対して外向きであるのに対し、DevOpsは社内向けのプラクティスの集合体であるとも言えます。
### Team ### チーム
Agile usually applies to software developers and project managers. The competencies of DevOps engineers lie in the intersection of development, QA (quality assurance) and operations as they are involved in all stages of the product cycle and are part of the Agile team. アジャイルは通常、ソフトウェア開発者とプロジェクトマネージャに適用されます。DevOpsエンジニアのコンピテンシーは、製品サイクルの全段階に関わり、アジャイルチームの一員であることから、開発、QA品質保証、運用が交差するところにあります。
### Applied Frameworks ### 応用フレームワーク
Agile has a lot of management frameworks to achieve flexibility and transparency: Scrum > Kanban > Lean > Extreme > Crystal > Dynamic > Feature-Driven. DevOps focuses on the development approach in collaboration but doesn't offer specific methodologies. However, DevOps promote practices like Infrastructure as Code, Architecture as Code, Monitoring, Self Healing, end to end test automation ... But per se this is not a framework, rather practices. アジャイルには、柔軟性と透明性を実現するために、多くのマネジメントフレームワークがあります。スクラムカンバンリーンエクストリームクリスタルダイナミックフィーチャードリブン。DevOpsは、コラボレーションによる開発アプローチに焦点を当てていますが、特定の方法論を提供しているわけではありません。しかし、DevOpsは、Infrastructure as Code、Architecture as Code、モニタリング、セルフヒーリング、エンドツーエンドのテスト自動化などのプラクティスを推進している。しかし、それ自体はフレームワークではなく、むしろプラクティスです。
### Feedback ### フィードバック
In Agile the main source of feedback is the end-user while in DevOps the feedback from stakeholders and the team itself has a higher priority. アジャイルではエンドユーザーが主なフィードバック元ですが、DevOpsではステークホルダーやチーム自身からのフィードバックがより優先されます。
### Target areas ### 対象領域
Agile focuses on software development more than on deployment and maintenance. DevOps focuses on software development as well but its values and tools also cover deployment and post-release stages like monitoring, high availability, security and data protection. アジャイルは、デプロイメントやメンテナンスよりも、ソフトウェア開発に重点を置いています。DevOpsはソフトウェア開発にも焦点を当てていますが、その価値観やツールはデプロイメントやモニタリング、高可用性、セキュリティ、データ保護などのリリース後の段階も対象としています。
### Documentation ### ドキュメンテーション
Agile prioritises flexibility and tasks at hand over documentation and monitoring. DevOps on the other hand regards project documentation as one of the essential project components. アジャイルは、文書化や監視よりも、柔軟性や目の前の作業を優先します。一方、DevOpsでは、プロジェクトのドキュメンテーションをプロジェクトの重要な構成要素の1つとみなしています。
### Risks ### リスク
Agile risks derive from the flexibility of the methodology. Agile projects are difficult to predict or evaluate as priorities and requirements are continually changing. アジャイルのリスクは、その手法の柔軟性に由来する。アジャイルプロジェクトは、優先順位と要件が絶えず変化するため、予測や評価が困難です。
DevOps risks derive from a misunderstanding of the term and the lack of suitable tools. Some people see DevOps as a collection of software for the deployment and continuous integration failing to change the underlying structure of the development process. DevOpsのリスクは、この用語の誤解と適切なツールの欠如に由来する。一部の人々は、DevOpsをデプロイと継続的統合のためのソフトウェアの集合体であり、開発プロセスの根本的な構造を変更することはできないと見ています。
### The Tools Used ### 使用ツール
Agile tools are focused on management communication collaboration, metrics and feedback processing. The most popular agile tools include JIRA, Trello, Slack, Zoom, SurveyMonkey and others. アジャイルツールは、管理コミュニケーションコラボレーション、メトリクス、フィードバック処理に重点を置いています。最も人気のあるアジャイルツールには、JIRA、Trello、Slack、Zoom、SurveyMonkeyなどがあります。
DevOps uses tools for team communication, software development, deployment and integration like Jenkins, GitHub Actions, BitBucket, etc. Even though agile and DevOps have slightly different focuses and scopes the key values are almost identical, therefore you can combine the two. DevOpsは、Jenkins、GitHub Actions、BitBucketなど、チームコミュニケーション、ソフトウェア開発、デプロイメント、統合のためのツールを使用します。アジャイルとDevOpsは、フォーカスやスコープが若干異なるものの、重要な価値はほぼ同じであるため、この2つを組み合わせることが可能です。
## Bring it all together… good idea or not? Discuss? ## 全部まとめて...良いアイデアかどうか?議論する?
The combination of Agile and DevOps brings the following benefits you will get: アジャイルとDevOpsの組み合わせは、あなたが得られる次のようなメリットをもたらします。
- Flexible management and powerful technology. - 柔軟なマネジメントと強力なテクノロジー。
- Agile practices help DevOps teams to communicate their priorities more efficiently. - アジャイルプラクティスは、DevOpsチームが優先順位をより効率的に伝えるのに役立ちます。
- The automation cost that you have to pay for your DevOps practices is justified by your agile requirement of deploying quickly and frequently. - DevOpsプラクティスのために支払わなければならない自動化コストは、迅速かつ頻繁にデプロイするというアジャイルの要件によって正当化される。
- It leads to strengthening: the team adopting agile practices will improve collaboration, increase the team's motivation and decrease employee turnover rates. - 強化につながる:アジャイルプラクティスを採用したチームは、コラボレーションを改善し、チームのモチベーションを高め、従業員の離職率を低下させることができます。
- As a result, you get better product quality. - その結果、より良い製品品質を得ることができる。
Agile allows coming back to previous product development stages to fix errors and prevent the accumulation of technical debt. To adopt agile and DevOps アジャイルでは、以前の製品開発段階まで戻ってエラーを修正し、技術的負債の蓄積を防ぐことができます。アジャイルとDevOpsを同時に採用するには、7つのステップを踏むだけです。
simultaneously just follow 7 steps:
1. Unite the development and operation teams. 1. 開発チームと運用チームを統合する
2. Create build and run teams, all development and operational concerns are discussed by the entire DevOps team. 2. ビルドと実行のチームを作り、開発と運用に関するすべての問題をDevOpsチーム全体で議論する
3. Change your approach to sprints, and assign priority ratings to offer DevOps tasks that have the same value than development tasks. Encourage development and operations teams to exchange their opinion on other teams workflow and possible issues. 3. スプリントへのアプローチを変更し、開発タスクと同じ価値を持つDevOpsタスクを提供するために、優先順位の評価を割り当てる。開発チームと運用チームが、他のチームのワークフローや考えられる問題について意見交換することを奨励する
4. Include QA in all development stages. 4. QAをすべての開発ステージに含める
5. Choose the right tools. 5. 適切なツールを選択する
6. Automate everything you can. 6. できることはすべて自動化する
7. Measure and control by using tangible numeric deliverables. 7. 具体的な数値化された成果物を用いて、測定と管理を行う
What do you think? Do you have different views? I want to hear from Developers, Operations, QA or anyone that has a better understanding of Agile and DevOps that can pass comments and feedback on this? いかがでしょうか異なる見解を持っていますか開発者、オペレーション、QA、あるいは、アジャイルとDevOpsをよりよく理解している人からのコメントやフィードバックが欲しいのですが。
### Resources ### リソース
- [DevOps for Developers Day in the Life: DevOps Engineer in 2021](https://www.youtube.com/watch?v=2JymM0YoqGA) - [DevOps for Developers Day in the Life: DevOps Engineer in 2021](https://www.youtube.com/watch?v=2JymM0YoqGA)
- [3 Things I wish I knew as a DevOps Engineer](https://www.youtube.com/watch?v=udRNM7YRdY4) - [3 Things I wish I knew as a DevOps Engineer](https://www.youtube.com/watch?v=udRNM7YRdY4)
- [How to become a DevOps Engineer feat. Shawn Powers](https://www.youtube.com/watch?v=kDQMjAQNvY4) - [How to become a DevOps Engineer feat. Shawn Powers](https://www.youtube.com/watch?v=kDQMjAQNvY4)
If you made it this far then you will know if this is where you want to be or not. See you on [Day 5](day05.md). ここまで来れば、ここが自分の望むところかどうかが分かるはずです。[5日目](day05.md)でお会いしましょう。
@ -1,85 +1,85 @@
--- ---
title: '#90DaysOfDevOps - Plan > Code > Build > Testing > Release > Deploy > Operate > Monitor > - Day 5' title: '#90DaysOfDevOps - DevOps ライフサイクル - 計画から監視まで - 5日目'
published: false published: false
description: 90DaysOfDevOps - Plan > Code > Build > Testing > Release > Deploy > Operate > Monitor > description: 90DaysOfDevOps - DevOps ライフサイクル - 計画から監視まで
tags: "devops, 90daysofdevops, learning" tags: "devops, 90daysofdevops, learning"
cover_image: null cover_image: null
canonical_url: null canonical_url: null
id: 1048830 id: 1048830
--- ---
## Plan > Code > Build > Testing > Release > Deploy > Operate > Monitor > ## 計画 > コーディング > ビルド > テスト > リリース > デプロイ > オペレート > 監視
Today we are going to focus on the individual steps from start to finish and the continuous cycle of an Application in a DevOps world. 今日は、DevOpsの世界におけるアプリケーションのスタートからゴールまでの個々のステップと継続的なサイクルに焦点を当てます。
![DevOps](Images/Day5_DevOps8.png) ![DevOps](Images/Day5_DevOps8.png)
### Plan: ### 計画:
It all starts with the planning process this is where the development team gets together and figure out what types of features and bug fixes that they're going to roll out in their next sprint. This is an opportunity as a DevOps Engineer for you to get involved with that and learn what kinds of things are going to be coming your way that you need to be involved with and also influence their decisions or their path and kind of help them work with the infrastructure that you've built or steer them towards something that's going to work better for them in case they're not on that path and so one key thing to point out here is the developers or software engineering team is your customer as a DevOps engineer so this is your opportunity to work with your customer before they go down a bad path. これは、開発チームが集まって、次のスプリントでどのような種類の機能やバグフィックスを展開するかを考える、計画プロセスから始まるものです。これはDevOpsエンジニアとして、このプロセスに参加し、どのような種類のものが自分のところにやってくるのかを知り、自分が関与する必要があります。また、彼らの決定や進路に影響を与え、あなたが構築したインフラで彼らが作業できるようにしたり、彼らがその進路にいない場合にもっとうまくいくものに舵を切ることができます。ここで一つ重要なことは、開発者またはソフトウェアエンジニアリングチームはDevOpsエンジニアとしてあなたのお客様であり、これは彼らが悪い方向に行く前にお客様と作業する機会です。
### Code: ### コーディング:
Now once that planning session is done they are going to start writing the code. You may or may not be involved a whole lot with this, but one of the places you may get involved is whenever they are writing code: you can help them better understand the infrastructure, so they know what services are available and how best to talk with those services. Once they are done they will merge that code into the repository. 計画セッションが終わると、彼らはコードを書き始めます。あなたがこの作業に関わることが多いか少ないかは別として、彼らがコードを書いているときに、インフラについてよりよく理解する手助けをすることができます。
### Build: ### ビルド:
This is where we'll kick off the first of our automation processes, because we're going to take their code and build it. Depending on what language they're using, that may mean transpiling it, compiling it, or creating a docker image from that code; either way, we're going to go through that process using our CI/CD pipeline. ここで、自動化プロセスの第一段階を開始します。彼らのコードを受け取り、彼らが使っている言語に応じて、トランスパイルやコンパイル、あるいはそのコードからDockerイメージを作成するなど、CI/CDパイプラインを使ってそのプロセスを進めていくのです。
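As a rough illustration, a build stage in the pipeline might run something like the following; the image name is a placeholder and the exact commands depend entirely on the language and packaging you have chosen.

```bash
# Compile the application (example for a Go code base)
go build -o bin/example-app .
# ...or package it as a container image (assumes a Dockerfile in the repository)
docker build -t example-app:1.0.0 .
```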
## Testing: ## テスト:
Once we've built it we're going to run some tests on it. The development team usually writes the tests, and you may have some input into what tests get written, but we need to run those tests. Testing is a way for us to try to minimise introducing problems into production; it doesn't guarantee that, but we want to get as close to a guarantee as we can that we are, one, not introducing new bugs and, two, not breaking things that used to work. 開発チームがテストを書くのが普通ですが、どのようなテストを書くかについて、あなたが何らかの意見を述べることもあるでしょう。しかし、私たちはテストを実行する必要があります。テストは、本番環境に問題を持ち込むのを最小限に抑えようとする方法であり、保証するものではありませんが、できる限り保証に近い形で、新しいバグを出さない、以前動いていたものを壊さないようにしたいのです。
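For a Go code base, the automated checks that run after every build might look something like the sketch below; other languages have their own equivalents, and the tests themselves are still written by the development team.

```bash
# Static analysis to catch common mistakes early
go vet ./...
# Run every unit test in the repository; a non-zero exit code fails the pipeline
go test ./...
```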
## Release: ## リリース:
Once those tests pass we're going to do the release process, and depending again on what type of application you're working on, this may not be a step at all. The code may just live in the GitHub repository, or the git repository, or wherever it lives, but it may also be the process of taking your compiled code or the docker image that you've built and putting it into a registry or a repository where it is accessible by your production servers for the deployment process. テストに合格したら、リリース処理を行うことになりますが、どのようなアプリケーションに取り組んでいるかによって、これは非段階的なものになるかもしれません。コードは GitHub リポジトリや git リポジトリなど、どこにでも置いておけますが、コンパイルしたコードや docker イメージをレジストリやリポジトリに置いて、本番サーバーからアクセスできるようにし、デプロイ処理を行うことがあります。
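Continuing the container example from the build stage, a release step could be as simple as tagging the image and pushing it to a registry the production servers can pull from; the registry address and image name here are placeholders.

```bash
# Tag the image for the registry and publish it (placeholder registry and image name)
docker tag example-app:1.0.0 registry.example.com/example-app:1.0.0
docker push registry.example.com/example-app:1.0.0
```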
## Deploy: ## デプロイ:
This is the thing that we do next, because deployment is the end game of this whole process: it is when we put the code into production, and it is not until we do that that our business actually realizes the value from all the time, effort and hard work that you and the software engineering team have put into this product up to this point. デプロイはコードを本番稼動させるときの最終ゲームのようなもので、デプロイを行うまでは、あなたとソフトウェアエンジニアリングチームがこの時点までに製品に費やしたすべての時間と努力の価値を、私たちのビジネスは実際に実感することができないからです。
## Operate: ## オペレート:
Once it's deployed we are going to operate it and operate it may involve something like you start getting calls from your customers that they're all annoyed that the site's running slow or their application is running slow right so you need to figure out why that is and then possibly build auto-scaling you know to handle increase the number of servers available during peak periods and decrease the number of servers during off-peak periods either way that's all operational type metrics, another operational thing that you do is include like a feedback loop from production back to your ops team letting you know about key events that happened in production such as a deployment back one step on the deployment thing this may or may not get automated depending on your environment the goal is to always automate it when possible there are some environments where you possibly need to do a few steps before you're ready to do that but ideally you want to deploy automatically as part of your automation process but if you're doing that it might be a good idea to include in your operational steps some type of notification so that your ops team knows that a deployment has happened デプロイされた後、私たちはそれを運用することになります。運用には、例えば顧客からサイトの動作が遅い、アプリケーションの動作が遅いと いった問い合わせを受けるようになり、その原因を突き止め、ピーク時にはサーバーの数を増やし、オフピーク時にはサーバー の数を減らすといった自動スケーリングを組み込む必要があるかもしれません。もう一つの運用上の工夫として、運用チームから運用チームへのフィードバック・ループを設け、運用中に発生した重要なイベント、例えばデプロイメントを一段階前に戻すといったことを知らせます。これは環境に応じて自動化することもしないこともできますが、可能であれば常に自動化することが目標です。しかし、理想的には、自動化プロセスの一環として自動的にデプロイしたいものです。その場合、運用ステップに何らかの通知を含めて、デプロイが行われたことを運用チームに知らせるのがよいでしょう。
## Monitor: ## 監視:
All of the above parts lead to the final step, because you need to have monitoring, especially around operational issues, auto-scaling and troubleshooting: you don't know 特に運用上の問題やオートスケーリング・トラブルシューティングなどでは、モニタリングが必要です。
there's a problem if you don't have monitoring in place to tell you that there's a problem. Some of the things you might build monitoring for are memory utilization, CPU utilization, disk space, API endpoints and response time, i.e. how quickly that endpoint is responding, and a big part of that as well is logs. Logs give developers the ability to see what is happening without having to access production systems. 例えば、メモリ使用量、CPU使用量、ディスク容量、APIエンドポイント、応答時間、エンドポイントの応答速度、そしてログがその大きな要素です。ログがあれば、開発者は本番システムにアクセスすることなく、何が起きているのかを確認することができます。
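As a rough idea of what those signals look like, the commands below are the kind of ad-hoc checks a monitoring system automates for you; they assume a Linux host and a hypothetical `/healthz` endpoint, so treat the URL as a placeholder.

```bash
free -h        # memory utilization
df -h /        # disk space
uptime         # load average as a quick CPU pressure check
# Endpoint availability and response time (URL is a placeholder)
curl -s -o /dev/null -w "HTTP %{http_code} in %{time_total}s\n" https://app.example.com/healthz
```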
## Rinse & Repeat: ## リンス・アンド・リピート:
Once that's in place you go right back to the beginning to the planning stage and go through the whole thing again それができたら、また最初に戻って企画を練り直し、全部をやり直します。
## Continuous: ## 継続的に
Many tools help us achieve this continuous process; all of this code, and the ultimate goal of being completely automated in cloud infrastructure or any environment, is often described as Continuous Integration / Continuous Delivery / Continuous Deployment, or “CI/CD” for short. We will spend a whole week on CI/CD later on in the 90 Days with some examples and walkthroughs to grasp the fundamentals. 多くのツールが、上記のような継続的なプロセス、これらすべてのコード、そして完全に自動化された最終的な目標、クラウドインフラやあらゆる環境を実現するために、しばしば継続的インテグレーション/継続的デリバリー/継続的デプロイメント、略して「CI/CD」と表現されます。CI/CDについては、90日の後半で1週間かけて、いくつかの例とウォークスルーで基本を把握する予定です。
### Continuous Delivery: ### 継続的デリバリー:
Continuous Delivery = Plan > Code > Build > Test 継続的デリバリー = 計画 > コーディング > ビルド > テスト
### Continuous Integration: ### 継続的インテグレーション:
This is effectively the outcome of the Continuous Delivery phases above plus the outcome of the Release phase. This is the case for both failure and success but this is fed back into continuous delivery or moved to Continuous Deployment. これは事実上、上記の継続的デリバリーフェーズの結果に、リリースフェーズの結果を加えたものです。これは失敗の場合も成功の場合も同じですが、これを継続的デリバリーにフィードバックしたり、継続的デプロイメントに移行したりします。
Continuous Integration = Plan > Code > Build > Test > Release 継続的インテグレーション=計画>コード>ビルド>テスト>リリース
### Continuous Deployment: ### 継続的デプロイメント:
If you have a successful release from your continuous integration then move to Continuous Deployment which brings in the following phases 継続的インテグレーションからのリリースが成功したら、継続的デプロイメントに移行し、次のフェーズに進みます。
CI Release is Success = Continuous Deployment = Deploy > Operate > Monitor CIリリースが成功した場合 = 継続的デプロイメント = デプロイ > 運用 > 監視
You can see these three Continuous notions above as the simple collection of phases of the DevOps Lifecycle. 上記の3つの継続的概念は、DevOpsライフサイクルのフェーズの単純な集合体であることがお分かりいただけると思います。
This last bit was a bit of a recap for me on Day 3 but think this actually makes things clearer for me. この最後の部分は、3日目の復習のようなものでしたが、これで実際に物事が明確になったと思っています。
### Resources: ### リソース:
- [DevOps for Developers Software or DevOps Engineer?](https://www.youtube.com/watch?v=a0-uE3rOyeU) - [DevOps for Developers Software or DevOps Engineer?](https://www.youtube.com/watch?v=a0-uE3rOyeU)
- [Techworld with Nana -DevOps Roadmap 2022 - How to become a DevOps Engineer? What is DevOps? ](https://www.youtube.com/watch?v=9pZ2xmsSDdo&t=125s) - [Techworld with Nana -DevOps Roadmap 2022 - How to become a DevOps Engineer? What is DevOps? ](https://www.youtube.com/watch?v=9pZ2xmsSDdo&t=125s)
- [How to become a DevOps Engineer in 2021 - DevOps Roadmap](https://www.youtube.com/watch?v=5pxbp6FyTfk) - [How to become a DevOps Engineer in 2021 - DevOps Roadmap](https://www.youtube.com/watch?v=5pxbp6FyTfk)
If you made it this far then you will know if this is where you want to be or not. ここまで来れば、ここが自分の居場所かどうかが分かるはずです。
See you on [Day 6](day06.md). では、[6日目](day06.md)でお会いしましょう。
@ -16,13 +16,13 @@ English Version | [中文版本](zh_cn/README.md) | [繁體中文版本](zh_tw/R
## 進捗 ## 進捗
- [✔️] ♾️ 1 > [Introduction](Days/day01.md) - [✔️] ♾️ 1 > [はじめに](Days/day01.md)
### DevOpsとは何か、なぜ使うのか ### DevOpsとは何か、なぜ使うのか
- [✔️] ♾️ 2 > [Responsibilities of a DevOps Engineer](Days/day02.md) - [✔️] ♾️ 2 > [DevOpsエンジニアの責務](Days/day02.md)
- [✔️] ♾️ 3 > [DevOps Lifecycle - Application Focused](Days/day03.md) - [✔️] ♾️ 3 > [DevOps ライフサイクル - アプリケーションフォーカス](Days/day03.md)
- [✔️] ♾️ 4 > [DevOps & Agile](Days/day04.md) - [✔️] ♾️ 4 > [DevOps & アジャイル](Days/day04.md)
- [✔️] ♾️ 5 > [Plan > Code > Build > Testing > Release > Deploy > Operate > Monitor >](Days/day05.md) - [✔️] ♾️ 5 > [Plan > Code > Build > Testing > Release > Deploy > Operate > Monitor >](Days/day05.md)
- [✔️] ♾️ 6 > [DevOps - The real stories](Days/day06.md) - [✔️] ♾️ 6 > [DevOps - The real stories](Days/day06.md)
@ -1,7 +1,7 @@
--- ---
title: '#90DaysOfDevOps - Introduction - Day 1' title: '#90DaysOfDevOps - 简介 - 第一天'
published: false published: false
description: 90DaysOfDevOps - Introduction description: 90DaysOfDevOps - 简介
tags: "devops, 90daysofdevops, learning" tags: "devops, 90daysofdevops, learning"
cover_image: null cover_image: null
canonical_url: null canonical_url: null
@ -10,70 +10,53 @@ id: 1048731
## 简介 - Day 1 ## 简介 - Day 1
在90天中的第一天我们开始学习DevOps的基本理解和工具。这些可以有助于建立DevOps的思维方式。 在90天中的第一天我们开始学习DevOps的基本理解和工具。这些可以有助于建立DevOps的思维方式。
<!-- Day 1 of our 90 days and adventure to learn a good foundational understanding of DevOps and tools that help with a DevOps mindset. -->
几年前,我开始学习相关内容,但我的关注于虚拟化平台和基于云的技术,主要研究基础设施即代码 (Infrastructure as Code, [IaC](https://www.ibm.com/cloud/learn/infrastructure-as-code)) 和Terraform和Chef的应用程序配置管理。 几年前,我开始学习相关内容,但我的关注于虚拟化平台和基于云的技术,主要研究基础设施即代码 (Infrastructure as Code, [IaC](https://www.ibm.com/cloud/learn/infrastructure-as-code)) 和Terraform和Chef的应用程序配置管理。
<!-- This learning journey started for me a few years back but my focus then was around virtualisation platforms and cloud based technologies, I was looking mostly into Infrastructure as Code and Application configuration management with Terraform and Chef. -->
接着来到2021年3月我得到了一个可以将精力集中在Veeam的Kasten的云原生部署上的机会。这个项目专注于Kubernetes和DepOps以及相关的社区。在开始学习后我很快发现在除了Kubernetes和容器化的基础知识那里还有一个非常广阔的世界。我开始在社区中交流学习更多关于DevOps文化、工具和流程最终我想公开地分享这些想法。 接着来到2021年3月我得到了一个可以将精力集中在Veeam的Kasten的云原生部署上的机会。这个项目专注于Kubernetes和DepOps以及相关的社区。在开始学习后我很快发现在除了Kubernetes和容器化的基础知识那里还有一个非常广阔的世界。我开始在社区中交流学习更多关于DevOps文化、工具和流程最终我想公开地分享这些想法。
<!-- Fast forward to March 2021, I was given an amazing opportunity to concentrate my efforts around the Cloud Native strategy at Kasten by Veeam. Which was going to be a massive focus on Kubernetes and DevOps and the community surrounding these technologies. I started my learning journey and quickly realised there was a very wide world aside from just learning the fundamentals of Kubernetes and Containerisation and it was then when I started speaking to the community and learning more and more about the DevOps culture, tooling and processes so I started documenting some of the areas I wanted to learn in public. -->
[So you want to learn DevOps?](https://blog.kasten.io/devops-learning-curve) [So you want to learn DevOps?](https://blog.kasten.io/devops-learning-curve)
## 开始我们的旅程吧 ## 开始我们的旅程吧
如果你阅读了以上的博客,你会发现这是我学习过程中的进阶内容。我认为我并不是以上以上任一领域的专家,但我希望分享一些免费和需付费的资源,我们可以按需选择。 如果你阅读了以上的博客,你会发现这是我学习过程中的进阶内容。我认为我并不是以上以上任一领域的专家,但我希望分享一些免费和需付费的资源,我们可以按需选择。
<!-- If you read the above blog you will see this is a high level contents for my learning journey and I will say at this point I am no where near an expert in any of these sections but what I wanted to do was share some resources both FREE and some paid for but an option for both as we all have different circumstances. -->
在接下来的90天里我想记录这些资料并涵盖那些基础领域。我希望社区参与进来分享你的相关经历和资源以便我们一起学习共同进步。 在接下来的90天里我想记录这些资料并涵盖那些基础领域。我希望社区参与进来分享你的相关经历和资源以便我们一起学习共同进步。
<!-- Over the next 90 days I want to document these resources and cover those foundational areas, I would love for the community to also get involved share your journey and resources so we can learn in public and help each other. -->
在该项目开头的README中你会了解到我已经将内容拆分成多个小节基本上是由12周加6天组成。前6天我们会大致探讨DevOps的基础后续再深入到一些特定领域。这份清单不是完美的再次希望社区参与进来并一起帮助它成为有用的资源。 在该项目开头的README中你会了解到我已经将内容拆分成多个小节基本上是由12周加6天组成。前6天我们会大致探讨DevOps的基础后续再深入到一些特定领域。这份清单不是完美的再次希望社区参与进来并一起帮助它成为有用的资源。
<!-- You will see from the opening readme in the project repository that I have split things into sections and it is basically 12 weeks plus 6 days. The first 6 days we will explore the fundamentals of DevOps in general before diving into some of the specific areas, by no way is this list exhaustive and again would love for the community to assist in making this a useful resource. -->
在这里我会分享另一个资源,我认为每个人都应认真了解,或是根据自身需求制作自己的思维导图,它的地址如下: 在这里我会分享另一个资源,我认为每个人都应认真了解,或是根据自身需求制作自己的思维导图,它的地址如下:
<!-- Another resource I will share at this point that I think everyone should have a good look at and maybe create your own mind map for yourself and your interest and position is the following: -->
[DevOps Roadmap](https://roadmap.sh/devops) [DevOps Roadmap](https://roadmap.sh/devops)
当我在创建这个初始清单和博客的时候我发现这是个很好的资源。你也可以看到除了在我列出的12个专题以外的其他领域更详细的信息。 当我在创建这个初始清单和博客的时候我发现这是个很好的资源。你也可以看到除了在我列出的12个专题以外的其他领域更详细的信息。
<!-- I found this a great resource when I was creating my initial list and blog post on this topic. You can also see there are other areas that go into a lot more detail outside of the 12 topics I have listed here in this repository. -->
## 第一步 - 什么是 DevOps? ## 第一步 - 什么是 DevOps?
这里可以列出很多的博客和YouTube视频但作为90天挑战的开始以及我们每天花费约一小时来学习一些新的或关于DevOps的东西。我觉得从宏观的“什么是DevOps”开始是个不错的选择。 这里可以列出很多的博客和YouTube视频但作为90天挑战的开始以及我们每天花费约一小时来学习一些新的或关于DevOps的东西。我觉得从宏观的“什么是DevOps”开始是个不错的选择。
<!-- There are so many blog articles and YouTube videos to list here, but as we start the 90 day challenge and we focus on spending around an hour a day learning something new or about DevOps I thought it was good to get some of the high level of "what DevOps is" down to begin. -->
首先DevOps不是工具。你不能购买它它不是可下载的软件sku或开源GitHub仓库。DevOps也不是编程语言或什么黑魔法。 首先DevOps不是工具。你不能购买它它不是可下载的软件sku或开源GitHub仓库。DevOps也不是编程语言或什么黑魔法。
<!-- Firstly, DevOps is not a tool. You cannot buy it, it is not a software sku or an open source GitHub repository you can download. It is also not a programming language, it is also not some dark art magic either. -->
DevOps是一种在软件开发中更明智的做事方式。- 等一下... 但如果你不是一个软件开发人员,你现在应该关闭这个页面并离开吗??不,继续读下去... 因为DevOps将软件开发和运维运营结合在了一起。我先前提及到我更多关注的是虚拟机方面的工作而这些通常属于运营。但在社区中不同背景的人们都可以通过更好地了解DevOps来学习那些实践案例。DevOps将100%造福于个人、开发者、运维运营和QA工程师。 DevOps是一种在软件开发中更明智的做事方式。- 等一下... 但如果你不是一个软件开发人员,你现在应该关闭这个页面并离开吗??不,继续读下去... 因为DevOps将软件开发和运维运营结合在了一起。我先前提及到我更多关注的是虚拟机方面的工作而这些通常属于运营。但在社区中不同背景的人们都可以通过更好地了解DevOps来学习那些实践案例。DevOps将100%造福于个人、开发者、运维运营和QA工程师。
<!-- DevOps is a way to do smarter things in Software Development. - Hold up... But if you are not a software developer should you turn away right now and not dive into this project??? No Not at all, Stay... Because DevOps brings together a combination of software development and operations. I mentioned earlier that I was more on the VM side and that would generally fall under the Operations side of the house, but within the community there are people with all different backgrounds where DevOps is 100% going to benefit the individual, Developers, Operations and QA Engineers all can equally learn these best practices by having a better understanding of DevOps. -->
DevOps是一系列有助于达成这一目标的实践减少产品从构思到发布阶段到最终用户或内部团队或客户的任何人所需要的时间。 DevOps是一系列有助于达成这一目标的实践减少产品从构思到发布阶段到最终用户或内部团队或客户的任何人所需要的时间。
<!-- DevOps is a set of practices that help to reach the goal of this movement: reducing the time between the ideation phase of a product and its release in production to the end-user or whomever it could be an internal team or customer. -->
在这第一个星期,我们将展开讨论**敏捷方法论**(The Agile Methodology)。DevOps和Agile是被广泛使用的方法为的是实现**应用程序app**的持续迭代更新。 在这第一个星期,我们将展开讨论**敏捷方法论**(The Agile Methodology)。DevOps和Agile是被广泛使用的方法为的是实现**应用程序app**的持续迭代更新。
<!-- Another area we will dive into in this first week is around **The Agile Methodology** DevOps and Agile are widely adopted together to achieve continuous delivery of your **Application** -->
宏观层次的收获是DevOps的思维方式是将漫长的软件发布过程从可能几年的时间拆分成更频繁的、较小的多次发布。另一个关键点是DevOps打破了团队间的隔阂开发人员、运维运营人员和QA工程师。 宏观层次的收获是DevOps的思维方式是将漫长的软件发布过程从可能几年的时间拆分成更频繁的、较小的多次发布。另一个关键点是DevOps打破了团队间的隔阂开发人员、运维运营人员和QA工程师。
<!-- The high level take away is with a DevOps mindset or culture its about taking a way the long drawn out software release process from potentially years to being able to drop smaller releases more frequently. The other key fundamental to take away here is it's about breaking down silos between the teams I previously mentioned, Developers, Operations and QA. -->
从DevOps的角度**开发、测试、部署**都属于DevOps团队。 从DevOps的角度**开发、测试、部署**都属于DevOps团队。
<!-- From a DevOps perspective, **Development, Testing and Deployment** all land with the DevOps team. -->
最后一点,我们必须通过**自动化**使得整个过程尽可能有效和高效。 最后一点,我们必须通过**自动化**使得整个过程尽可能有效和高效。
<!-- The final point I will make is to make this as effective and efficient as possible we must leverage **Automation** -->
<!-- ## 信息来源 --> ## 相关资料
## Resources
I am always open to adding additional resources to these readme files as it is here as a learning tool. 我始终欢迎大家在readme文件中添加资料将它作为一个学习工具。
My advice is to watch all of the below and hopefully you also picked something up from the text and explanations above. 我的建议是浏览下面的内容,希望你也能从文字解释中有所收获。
- [DevOps in 5 Minutes](https://www.youtube.com/watch?v=Xrgk023l4lI) - [DevOps in 5 Minutes](https://www.youtube.com/watch?v=Xrgk023l4lI)
- [What is DevOps? Easy Way](https://www.youtube.com/watch?v=_Gpe1Zn-1fE&t=43s) - [What is DevOps? Easy Way](https://www.youtube.com/watch?v=_Gpe1Zn-1fE&t=43s)
- [DevOps roadmap 2022 | Success Roadmap 2022](https://www.youtube.com/watch?v=7l_n97Mt0ko) - [DevOps roadmap 2022 | Success Roadmap 2022](https://www.youtube.com/watch?v=7l_n97Mt0ko)
If you made it this far then you will know if this is where you want to be or not. See you on [Day 2](day02.md). 如果你已看到这里那么你已知道是否要继续学习DevOps了。[第二天](day02.md)见。
@ -1,60 +1,61 @@
--- ---
title: '#90DaysOfDevOps - The Big Picture: Learning a Programming Language - Day 7' title: '#90DaysOfDevOps - 概述DevOps 与学习一门编程语言 - 第七天'
published: false published: false
description: 90DaysOfDevOps - The Big Picture DevOps & Learning a Programming Language description: 90DaysOfDevOps - 概述DevOps 与学习一门编程语言
tags: 'devops, 90daysofdevops, learning' tags: 'devops, 90daysofdevops, learning'
cover_image: null cover_image: null
canonical_url: null canonical_url: null
id: 1048856 id: 1048856
--- ---
## The Big Picture: DevOps & Learning a Programming Language ## 概述DevOps与学习一门编程语言
I think it is fair to say to be successful in the long term as a DevOps engineer you've got to know at least one programming language at a foundational level. I want to take this first session of this section to explore why this is such a critical skill to have, and hopefully, by the end of this week or section, you are going to have a better understanding of the why, how and what to do to progress with your learning journey. 我认为要想在长期成为一名成功的DevOps工程师你需要了解至少一种编程语言的基本用法。我想通过本节的第一部分来探究为什么这是一项重要的技能。同时也希望在本周或本节结束时你会更好地理解在这个学习过程中为什么、如何做和做什么。
I think if I was to ask out on social do you need to have programming skills for DevOps related roles, the answer will be most likely a hard yes? Let me know if you think otherwise? Ok but then a bigger question and this is where you won't get such a clear answer is which programming language? The most common answer I have seen here has been Python or increasingly more often, we're seeing Golang or Go should be the language that you learn. 我想如果我在社交平台上提问从事与DevOps的相关工作是否需要具备编程能力得到的答案很可能是肯定的如果你不这么认为欢迎告诉我。但一个更大的问题是你无法明确知道所需的是哪种编程语言一个最常见的回答是我看到Python正变得越来越热门而我们应该选择学习Golang或Go语言。
To be successful in DevOps you have to have a good knowledge of programming skills is my takeaway from that at least. But we have to understand why we need it to choose the right path. 为了在DevOps中获得成功至少在我看来你应该具备良好的编程知识。但我们也该知道为什么我们需要它来选择正确的方向。
## Understand why you need to learn a programming language. ## 了解为什么需要学习编程语言
The reason that Python and Go are recommended so often for DevOps engineers is that a lot of the DevOps tooling is written in either Python or Go, which makes sense if you are going be build DevOps tools. Now this is important as this will determine really what you should learn and that would likely be the most beneficial. If you are going to be building DevOps tools or you are joining a team that does then it would make sense to learn that same language, if you are going to be heavily involved in Kubernetes or Containers then it's more than likely that you would want to choose Go as your programming language. For me, the company I work for (Kasten by Veeam) is in the Cloud-Native ecosystem focused on data management for Kubernetes and everything is written in Go. 许多DevOps的工具是用Python或Go编写的如果你要构建DevOps工具这将为你提供便利。这也成为了DevOps推荐Python和Go的原因。如今这会影响你决定学习哪一种编程语言并可能是对你最有益的。如果你想构建DevOps工具或是加入一个从事相关工作的团队选择学习与之相同的语言将是有意义的。如果你需要大量使用Kubernetes或Containers那么你很可能会将Go作为你的编程语言。对我来说我工作的公司(Kasten by Veeam) 位于云原生态系统领域(Cloud-Native ecosystem)专注于Kubernetes的数据管理并且所有工作都用Go来编写。
But then you might not have clear cut reasoning like that to choose you might be a student or transitioning careers with no real decision made for you. I think in this situation then you should choose the one that seems to resonate and fit with the applications you are looking to work with. 但或许你是一名学生或过渡职业,可能没有像这样明确的方向来帮助你做出选择。我觉得在这个情况下,你应该选择一个与你感兴趣的应用程序有相近特点的。
Remember I am not looking to become a software developer here I just want to understand a little more about the programming language so that I can read and understand what those tools are doing and then that leads to possibly how we can help improve things. 请记住,我在这里并不是为了成为一名程序开发者。我只是想去更多地了解编程语言,从而让我能够阅读和理解那些工具在做些什么,进而有可能启发我们如何改进相关的工作。
I would also it is also important to know how you interact with those DevOps tools which could be Kasten K10 or it could be Terraform and HCL. These are what we will call config files and this is how you interact with those DevOps tools to make things happen, commonly these are going to be YAML. (We may use the last day of this section to dive a little into YAML) 另一个重要的点是了解如何与DevOps工具(Kasten K10, Terraform 又或是HCL)进行交互。这些就是我们所说的配置文件(config files)它就是帮助你与那些DevOps工具成功交互的东西通常它们会以YAML的格式出现。(我们可能会在本节的最后一天稍微讲解YAML)
## Did I just talk myself out of learning a programming language? ## 我只是自说自话而不是学习编程语言吗?
Most of the time or depending on the role, you will be helping engineering teams implement DevOps into their workflow, a lot of testing around the application and making sure that the workflow that is built aligns to those DevOps principles we mentioned over the first few days. But in reality, this is going to be a lot of the time troubleshooting an application performance issue or something along those lines. This comes back to my original point and reasoning, the programming language I need to know is the one that the code is written in? If their application is written in NodeJS it wont help much if you have a Go or Python badge. 大多数时候或根据担任的角色你会帮助工程团队将DevOps在他们的工作流程中实现。大量围绕应用程序的测试并确保被构建的工作流程符合我们前几天提到的那些DevOps原则。但实际上这个过程将有大量时间花费在寻找程序性能或类似的问题上。这回到了我最初的观点我需要了解那个被用于编写程序源码的编程语言吗如果那个应用程序是用NodeJS编写的而你熟悉的是Go或Python那么这些知识并不能帮助你很多。
## Why Go ## 为什么选Go
Why Golang is the next programming language for DevOps, Go has become a very popular programming language in recent years. According to the StackOverflow Survey for 2021 Go came in fourth for the most wanted Programming, scripting and markup languages with Python being top but hear me out. [StackOverflow 2021 Developer Survey Most Wanted Link](https://insights.stackoverflow.com/survey/2021#section-most-loved-dreaded-and-wanted-programming-scripting-and-markup-languages) 为什么Golang是DevOps的下一个编程语言Go已经成为近年非常流行的编程语言。根据2021年StackOverflow的调查Go在最受欢迎的编程、脚本和标记语言中排名第四其中Python位于榜首但也请继续看完。[StackOverflow 2021 Developer Survey Most Wanted Link](https://insights.stackoverflow.com/survey/2021#section-most-loved-dreaded-and-wanted-programming-scripting-and-markup-languages)
As I have also mentioned some of the most known DevOps tools and platforms are written in Go such as Kubernetes, Docker, Grafana and Prometheus. 正如我提到的那样一些最出名的DevOps工具和平台是用Go来编写的例如Kubernetes、Docker、Grafana和Prometheus。
What are some of the characteristics of Go that make it great for DevOps? 那么Go具备哪些适合DevOps的特性呢
## Build and Deployment of Go Programs ## Go的构建和部署
An advantage of using an interpreted language like Python in a DevOps role is that you don't need to compile a Python program before running it; especially for smaller automation tasks, you don't want to be slowed down by a build process that requires compilation. Even though Go is a compiled programming language, **Go compiles directly into machine code** and is also known for fast compilation times.
## Go vs Python for DevOps 一个优势是就像Python那样具备解释性并且在DevOps工作中你无需在运行程序之前进行编译。特别是对于小规模的自动化任务你不希望在构建的过程中被编译流程拖后腿。Go是一个编译性的编程语言**Go直接完成编译变成机器码**。Go也是出了名的编译速度快。
Go Programs are statically linked, this means that when you compile a go program everything is included in a single binary executable, no external dependencies will be required that would need to be installed on the remote machine, this makes the deployment of go programs easy, compared to python program that uses external libraries you have to make sure that all those libraries are installed on the remote machine that you wish to run on. ## DevOps的Go vs Python
Go is a platform-independent language, which means you can produce binary executables for *all the operating systems, Linux, Windows, macOS etc and very easy to do so. With Python, it is not as easy to create these binary executables for particular operating systems. Go程序是静态链接的这意味着当你编译一个go程序时所有的东西都会被放在一个二进制执行文件里并且不需要在远程机器上安装外部依赖。对比在运行使用了外部库的Python程序时它需要确保所用到的库都已安装在这台远程计算机上这一特点让go程序的部署变得简单。
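For example, the same source file can be compiled for several operating systems just by setting two environment variables; this is a sketch that assumes a `main.go` sits in the current folder and that the output names are arbitrary.

```bash
# Build binaries for different platforms from one machine
GOOS=linux   GOARCH=amd64 go build -o app-linux main.go
GOOS=windows GOARCH=amd64 go build -o app.exe main.go
GOOS=darwin  GOARCH=arm64 go build -o app-darwin main.go
```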
Go is a very performant language, it has fast compilation and fast run time with lower resource usage like CPU and memory especially compared to python, numerous optimisations have been implemented in the Go language that makes it so performant. (Resources below) Go是一种独立于平台的语言,这意味着你可以很轻松地为\*所有操作系统 Linux、Windows、macOS等等生成二进制可执行文件。而对于Python来说为特定操作系统制作二进制可执行文件就没那么简单了。
Unlike Python which often requires the use of third party libraries to implement a particular python program, go includes a standard library that has the majority of functionality that you would need for DevOps built directly into it. This includes functionality file processing, HTTP web services, JSON processing, native support for concurrency and parallelism as well as built-in testing. Go是一个具备非常高性能的语言它可以快速完成编译并且比Python占用更少的CPU、内存等资源。许多优化已经被应用于Go语言中使其能达到高性能。(详见文末Resources)
This is by no way throwing Python under the bus I am just giving my reasons for choosing Go but they are not the above Go vs Python it's generally because it makes sense as the company I work for develops software in Go so that is why. 与常需要使用第三方库来实现特定程序的Python不同Go包含了一个标准库其中有DevOps所需的大部分功能。包括文件处理功能、HTTP Web服务、JSON处理、对并发和并行的本机支持和内置测试。
I will say, or at least I am told as I am not many pages into this chapter right now, that once you learn your first programming language it becomes easier to take on other languages. You're probably never going to have a single job in any company anywhere where you don't have to deal with managing, architecting, orchestrating or debugging JavaScript and NodeJS applications. 我会说一旦你学习了你的第一门编程语言,学习其他的语言将变得更简单。你可能不会永远只在一家公司里做一个岗位的工作,你很有可能会接触到管理、架构、编排、调试JavaScript和NodeJS的应用程序。
## Resources 我会说一旦你学习了你的第一门编程语言学习其他的语言将变得更简单。你可能不会永远只在一家公司里做一个岗位的工作你很有可能会接触到管理、架构、编排、调试JavaScript和NodeJS的应用程序。
## 相关资料
- [StackOverflow 2021 Developer Survey](https://insights.stackoverflow.com/survey/2021) - [StackOverflow 2021 Developer Survey](https://insights.stackoverflow.com/survey/2021)
- [Why we are choosing Golang to learn](https://www.youtube.com/watch?v=7pLqIIAqZD4&t=9s) - [Why we are choosing Golang to learn](https://www.youtube.com/watch?v=7pLqIIAqZD4&t=9s)
@ -64,6 +65,6 @@ I will say that once you have or at least I am told as I am not many pages into
- [FreeCodeCamp - Learn Go Programming - Golang Tutorial for Beginners](https://www.youtube.com/watch?v=YS4e4q9oBaU&t=1025s) - [FreeCodeCamp - Learn Go Programming - Golang Tutorial for Beginners](https://www.youtube.com/watch?v=YS4e4q9oBaU&t=1025s)
- [Hitesh Choudhary - Complete playlist](https://www.youtube.com/playlist?list=PLRAV69dS1uWSR89FRQGZ6q9BR2b44Tr9N) - [Hitesh Choudhary - Complete playlist](https://www.youtube.com/playlist?list=PLRAV69dS1uWSR89FRQGZ6q9BR2b44Tr9N)
Now for the next 6 days of this topic my intention is to work through some of the resources listed above and document my notes for each day. You will notice that they are generally around 3 hours as a full course, I wanted to share my complete list so that if you have time you should move ahead and work through each one if time permits, I will be sticking to my learning hour each day. 在未来的6天里我打算通过上述资料来帮助我组织每天的笔记。你会注意到它们作为一整个课程通常需要3个小时来了解。我想分享我的完成列表如果时间允许你应该去每个链接看看我也会每天坚持学习。
See you on [Day 8](day08.md). 让我们[第八天](day08.md)再见。
@ -1,67 +1,67 @@
--- ---
title: '#90DaysOfDevOps - Setting up your DevOps environment for Go & Hello World - Day 8' title: '#90DaysOfDevOps - 配置 Go 语言的 DevOps 环境 & Hello World - 第八天'
published: false published: false
description: 90DaysOfDevOps - Setting up your DevOps environment for Go & Hello World description: 90DaysOfDevOps - 配置 Go 语言的 DevOps 环境 & Hello World
tags: 'devops, 90daysofdevops, learning' tags: 'devops, 90daysofdevops, learning'
cover_image: null cover_image: null
canonical_url: null canonical_url: null
id: 1048857 id: 1048857
--- ---
## Setting up your DevOps environment for Go & Hello World ## 配置Go语言的DevOps环境 & Hello World
Before we get into some of the fundamentals of Go we should get Go installed on our workstation and do what every "learning programming 101" module teaches us which is to create the Hello World app. As this one is going to be walking through the steps to get Go installed on your workstation we are going to attempt to document the process in pictures so people can easily follow along. 在我们开始学习Go的基础知识之前我们要在我们的工作设备上安装Go并按照“学习编程101”的教程来创建Hello World程序。这一小节将一步步在你的机器上完成Go的安装我们会使用截图来记录整个过程从而让大家能更好地跟进。
First of all, let's head on over to [go.dev/dl](https://go.dev/dl/) and you will be greeted with some available options for downloads. 首先,前往[go.dev/dl](https://go.dev/dl/),你会看到一些可供下载的选项。
![](Images/Day8_Go1.png) ![](../../Days/Images/Day8_Go1.png)
If we made it this far you probably know which workstation operating system you are running so select the appropriate download and then we can get installing. I am using Windows for this walkthrough, basically, from this next screen, we can leave all the defaults in place for now. ***(I will note that at the time of writing this was the latest version so screenshots might be out of date)*** 如果你已经知道了你的工作设备运行的是哪个操作系统选择对应的下载选项然后我们就可以开始安装了。在这个演示中我使用的是Windows。基本的从下面的截屏开始我们可以保留所有默认设置。***我在撰写这里的时候这是最新版本,所以截图可能已经过时***
![](Images/Day8_Go2.png) ![](../../Days/Images/Day8_Go2.png)
Also note if you do have an older version of Go installed you will have to remove this before installing, Windows has this built into the installer and will remove and install as one. 另外需要注意如果你安装了较旧版本的Go你应该在安装新版前将它卸载掉。Windows已将它内置到安装程序中并会作为一个整体来删除和安装。
Once finished you should now open a command prompt/terminal and we want to check that we have Go installed. If you do not get the output that we see below then Go is not installed and you will need to retrace your steps. 完成后,你现在应该打开命令提示符/终端我们检查一下是否已安装Go。如果你没有下图的输出那么Go没有安装成功你需要重新进行刚才的步骤。
`go version` `go version`
![](Images/Day8_Go3.png) ![](../../Days/Images/Day8_Go3.png)
Next up we want to check our environment for Go. This is always good to check to make sure your working directories are configured correctly, as you can see below we need to make sure you have the following directory on your system. 接下来我们检查一下Go的环境。这样的检查可以很好地确认你的工作目录配置是正确的你可以看到下图的信息我们需要确保这些地址存在于你的系统中。
![](Images/Day8_Go4.png) ![](../../Days/Images/Day8_Go4.png)
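If the screenshot is hard to follow, the same check can be run from the terminal; on most systems the value defaults to a `go` folder inside your home directory, so adjust the paths in the next steps to whatever yours reports.

```bash
# Print the workspace directory Go expects to use
go env GOPATH
```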
Did you check? Are you following along? You will probably get something like the below if you try and navigate there. 已经检查完了吗?有跟着去操作嘛?如果你尝试去到那里,你很可能会得到类似下图的内容。
![](Images/Day8_Go5.png) ![](../../Days/Images/Day8_Go5.png)
Ok, let's create that directory for ease I am going to use the mkdir command in my powershell terminal. We also need to create 3 folders within the Go folder as you will see also below. OK为了方便起见我们创建新的目录我将在powershell终端中输入mkdir命令。我们还需要在Go文件夹中新建三个文件夹如下图所示。
![](Images/Day8_Go6.png) ![](../../Days/Images/Day8_Go6.png)
Now we have Go installed and our Go working directory ready for action. We now need an integrated development environment (IDE). There are many out there that you can use, but the most common, and the one I use, is Visual Studio Code, or Code. You can learn more about IDEs [here](https://www.youtube.com/watch?v=vUn5akOlFXQ).

If you have not already downloaded and installed VSCode on your workstation, you can do so by heading [here](https://code.visualstudio.com/download). As you can see below, you have the different OS options.

![](../../Days/Images/Day8_Go7.png)

Much the same as with the Go installation, we are going to download and install it and keep the defaults. Once complete, you can open VSCode, select Open File and navigate to the Go directory we created above.

![](../../Days/Images/Day8_Go8.png)

You may get a popup about trust; read it if you want and then hit "Yes, trust the authors". (I am not responsible later on if you start opening things you don't trust!)

Now you should see the three folders we created earlier. What we want to do now is right-click the src folder and create a new folder called `Hello`.

![](../../Days/Images/Day8_Go9.png)

Pretty easy stuff up to this point, I would say. Now we are going to create our first Go program without understanding anything we put in this next phase.

Next, create a file called `main.go` in your `Hello` folder. As soon as you hit enter on main.go you will be asked if you want to install the Go extension and packages; you can also check the empty pkg folder that we made a few steps back and notice that we should now have some new packages in there.

![](../../Days/Images/Day8_Go10.png)

Now let's get this Hello World app going. Copy the following code into your new main.go file and save it.
```
package main

import "fmt"

func main() {
    fmt.Println("Hello #90DaysOfDevOps")
}
```
Now I appreciate that the above might make no sense at all, but we will cover functions, packages and more in the coming days. For now, let's run our app. Back in the terminal, in our Hello folder, we can check that all is working. Using the command below we can check to see if our generic learning program is working.

```
go run main.go
```

![](../../Days/Images/Day8_Go11.png)
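For reference, the terminal output should look something like this (a sketch of the expected result; your prompt will look different):

```
$ go run main.go
Hello #90DaysOfDevOps
```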
It doesn't end there though; what if we now want to take our program and run it on other Windows machines? We can do that by building our binary using the following command:

```
go build main.go
```

![](../../Days/Images/Day8_Go12.png)

When we run the resulting binary, we see the same output:

```bash
$ ./main.exe
Hello #90DaysOfDevOps
```
## Resources
- [StackOverflow 2021 Developer Survey](https://insights.stackoverflow.com/survey/2021)
- [Why we are choosing Golang to learn](https://www.youtube.com/watch?v=7pLqIIAqZD4&t=9s)
@ -102,6 +104,6 @@ If we run that
- [Hitesh Choudhary - Complete playlist](https://www.youtube.com/playlist?list=PLRAV69dS1uWSR89FRQGZ6q9BR2b44Tr9N)
See you on [Day 9](day09.md).

![](../../Days/Images/Day8_Go13.png)

View File

@ -1,71 +1,75 @@
---
title: '#90DaysOfDevOps - Let''s explain the Hello World code - Day 9'
published: false
description: 90DaysOfDevOps - Let's explain the Hello World code
tags: "devops, 90daysofdevops, learning"
cover_image: null
canonical_url: null
id: 1048732
---
## Let's explain the Hello World code

### How Go works

On [Day 8](day08.md) we walked through getting Go installed on your workstation and we then created our first Go application.

In this section, we are going to take a deeper look into the code and understand a few more things about the Go language.

### What is Compiling?

Before we get into the [6 lines of the Hello World code](Go/hello.go) we need to have a bit of an understanding of compiling.

Programming languages that we commonly use, such as Python, Java, Go and C++, are high-level languages, meaning they are human-readable. When a machine tries to execute a program, however, it needs to be in a form that the machine can understand, so we have to translate our human-readable code into machine code; this is called compiling.

![](../../Days/Images/Day9_Go1.png)

From the above you can see what we did on [Day 8](day08.md): we created a simple Hello World main.go and then used the command `go build main.go` to compile our executable.

### What are packages?

A package is a collection of source files in the same directory that are compiled together. To simplify further, a package is a bunch of .go files in the same directory. Remember our Hello folder from Day 8? If and when you get into more complex Go programs you might find that you have folder1, folder2 and folder3 containing different .go files that make up your program from multiple packages.

We use packages so we can reuse other people's code; we don't have to write everything from scratch. Maybe we want a calculator as part of our program: you could probably find an existing Go package that contains the mathematical functions you need, and importing it into your code saves you a lot of time and effort in the long run.

Go encourages you to organise your code in packages so that it is easy to reuse and maintain source code.
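As a small illustration of that idea (my own example, not part of the walkthrough), the standard library's `math` package already provides common mathematical functions that we can import rather than write ourselves:

```go
package main

import (
	"fmt"
	"math"
)

func main() {
	// Reuse the standard library's math package instead of writing our own square root.
	fmt.Println("The square root of 16 is", math.Sqrt(16))
}
```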
### Hello #90DaysOfDevOps Line by Line

Now let's take a look at our Hello #90DaysOfDevOps main.go file and walk through the lines.

![](../../Days/Images/Day9_Go2.png)

In the first line you have `package main`, which means that this file belongs to a package called main. All .go files need to belong to a package, and they should have `package something` as the opening line.

A package can be named whatever you wish. We have to call this one `main` because it is the starting point of the program that lives in this package; that is a rule. (I need to understand more about this rule?)

![](../../Days/Images/Day9_Go3.png)

Whenever we want to compile and execute our code we have to tell the machine where execution needs to start. We do this by writing a function called main; the machine will look for a function called main to find the entry point of the program.

A function is a block of code that performs a specific task and can be used across the program.

You can declare a function with any name using `func`, but in this case we need to name it `main` as this is where the code starts.

![](../../Days/Images/Day9_Go4.png)
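To make the role of `main` concrete, here is a small sketch (my own example, not from the walkthrough) where `main` is the entry point and a second function with a name of our own choosing is called from it:

```go
package main

import "fmt"

// greet is an ordinary function whose name we chose ourselves.
func greet() {
	fmt.Println("Hello from another function")
}

// main is where execution starts; the machine looks for this function.
func main() {
	greet()
	fmt.Println("Hello #90DaysOfDevOps")
}
```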
Next we are going to look at line 3 of our code, the import. This basically means you want to bring another package into your main program. fmt is a standard package provided by Go; it contains the `Println()` function and, because we have imported it, we can use it in line 6. There are a number of standard packages you can include in your program and leverage or reuse in your code, saving you the hassle of having to write things from scratch. [Go Standard Library](https://pkg.go.dev/std)

![](../../Days/Images/Day9_Go5.png)

The `Println()` that we have here is a way to write to standard output in the terminal wherever the executable has been executed successfully. Feel free to change the message in between the ().

![](../../Days/Images/Day9_Go6.png)
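If you want to experiment with that, a quick variation (again my own sketch) changes the message and also shows `fmt.Printf`, which formats values into the output:

```go
package main

import "fmt"

func main() {
	// Println writes a line to standard output.
	fmt.Println("Hello #90DaysOfDevOps, greetings from day 9")

	// Printf lets us format values into the string we print.
	days := 90
	fmt.Printf("Only %d days to go!\n", days)
}
```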
### TLDR

- **Line 1** = This file will be in the package called `main`, and it needs to be called `main` because it includes the entry point of the program.
- **Line 3** = For us to use `Println()` we have to import the fmt package so we can use it on line 6.
- **Line 5** = The actual starting point, the `main` function.
- **Line 6** = This will let us print "Hello #90DaysOfDevOps" on our system.

## Resources
- [StackOverflow 2021 Developer Survey](https://insights.stackoverflow.com/survey/2021)
- [Why we are choosing Golang to learn](https://www.youtube.com/watch?v=7pLqIIAqZD4&t=9s)
@ -75,4 +79,4 @@ the `Println()` that we have here is a way in which to write to a standard outpu
- [FreeCodeCamp - Learn Go Programming - Golang Tutorial for Beginners](https://www.youtube.com/watch?v=YS4e4q9oBaU&t=1025s)
- [Hitesh Choudhary - Complete playlist](https://www.youtube.com/playlist?list=PLRAV69dS1uWSR89FRQGZ6q9BR2b44Tr9N)
See you on [Day 10](day10.md).

View File

@ -1,87 +1,95 @@
---
title: '#90DaysOfDevOps - The Go Workspace - Day 10'
published: false
description: 90DaysOfDevOps - The Go Workspace
tags: "devops, 90daysofdevops, learning"
cover_image: null
canonical_url: null
id: 1048701
---
### The Go Workspace

On [Day 8](day08.md) we briefly covered the Go workspace to get Go up and running for the `Hello #90DaysOfDevOps` demo, but we should explain a little more about the Go workspace.

Remember we chose the defaults and then went through and created our Go folder in the GOPATH that was already defined, but in reality this GOPATH can be changed to wherever you want it to be.

If you run

```
echo $GOPATH
```

The output should be similar to mine (maybe with a different username), which is:

```
/home/michael/projects/go
```
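If `echo $GOPATH` prints nothing (for example on Windows, where the variable may not be set in your shell), you can ask the Go tooling directly; this is just a quick sanity check, and the path shown is an example from my machine:

```
$ go env GOPATH
/home/michael/projects/go
```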
Then within here we created 3 directories: **src**, **pkg** and **bin**.

![](../../Days/Images/Day10_Go1.png)

**src** is where all of your Go programs and projects are stored. This handles the namespace and package management for all of your Go repositories. This is where, on our workstation, you will see the Hello folder for the Hello #90DaysOfDevOps project.

![](../../Days/Images/Day10_Go2.png)

**pkg** is where the archived files of packages that are or were used by your programs are stored. This helps to speed up compiling, because a package only needs to be recompiled if it has been modified.

![](../../Days/Images/Day10_Go3.png)

**bin** is where all of your compiled binaries are stored.

![](../../Days/Images/Day10_Go4.png)
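Putting those three folders together with the Hello project from Day 8, the workspace looks roughly like this (a sketch; your GOPATH location will differ):

```
go
├── bin
├── pkg
└── src
    └── Hello
        └── main.go
```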
Our Hello #90DaysOfDevOps is not a complex program, so here is an example of a more complex Go program layout taken from another great resource worth looking at, [GoChronicles](https://gochronicles.com/).

![](../../Days/Images/Day10_Go5.png)

That page also goes into some great detail about why and how the layout is like this, and it digs a little deeper into other folders we have not mentioned: [GoChronicles](https://gochronicles.com/project-structure/)
### Compiling & running code

On [Day 9](day09.md) we covered a brief introduction to compiling code, but we can go a little deeper here.

To run our code we first must **compile** it. There are three ways to do this within Go:

- go build
- go install
- go run

Before we get to the above compile stage, we need to take a look at what we get with the Go installation.

When we installed Go on Day 8, this installed something known as the Go tools, which consist of several programs that let us build and process our Go source files. One of those tools is `go`.

It is worth noting that you can install additional tools that are not in the standard Go installation.

If you open your command prompt and type `go` you should see something like the image below, followed by "Additional Help Topics"; for now we don't need to worry about those.

![](../../Days/Images/Day10_Go6.png)

You might also remember that we have already used at least two of these tools so far on Day 8.

![](../../Days/Images/Day10_Go7.png)

The ones we want to learn more about are build, install and run.

![](../../Days/Images/Day10_Go8.png)

- `go run` - This command compiles and runs the main package comprised of the .go files specified on the command line. The program is compiled into a temporary folder.
- `go build` - This compiles packages and their dependencies, compiling the package in the current directory. If it is the `main` package, it will place the executable in the current directory; if not, the compiled package archive ends up under the `pkg` folder. `go build` also enables you to build an executable file for any Go-supported OS platform.
- `go install` - The same as go build, but it will place the executable in the `bin` folder.

We have run through go build and go run, but feel free to run through them again here if you wish; `go install`, as stated above, puts the executable in our bin folder.

![](../../Days/Images/Day10_Go9.png)
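As a quick side-by-side of those three commands, and of `go build` targeting another platform, here is a sketch assuming you are still in the Hello folder and using a Linux/macOS shell for the environment variables (on PowerShell you would set `GOOS`/`GOARCH` with `$env:` first); the exact binary names depend on your OS:

```
go run main.go        # compile to a temporary folder and run straight away
go build main.go      # leave main (or main.exe) in the current directory
go install            # place the executable in $GOPATH/bin

GOOS=linux GOARCH=amd64 go build main.go   # cross-compile a Linux binary from any platform
```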
Hopefully, if you are following along, you are watching one of the playlists or videos below. I am taking bits of all of these and translating them into my notes so that I can understand the foundational knowledge of the Golang language. The resources below are likely going to give you a much better understanding of a lot of the areas you need overall, but I am trying to document the 7 days or 7 hours worth of the journey with the interesting things that I have found.

## Resources
- [StackOverflow 2021 Developer Survey](https://insights.stackoverflow.com/survey/2021)
- [Why we are choosing Golang to learn](https://www.youtube.com/watch?v=7pLqIIAqZD4&t=9s)
@ -91,4 +99,4 @@ Hopefully, if you are following along you are watching one of the playlists or v
- [FreeCodeCamp - Learn Go Programming - Golang Tutorial for Beginners](https://www.youtube.com/watch?v=YS4e4q9oBaU&t=1025s)
- [Hitesh Choudhary - Complete playlist](https://www.youtube.com/playlist?list=PLRAV69dS1uWSR89FRQGZ6q9BR2b44Tr9N)
See you on [Day 11](day11.md).

View File

@ -7,63 +7,64 @@ cover_image: null
canonical_url: null
id: 1048732
---
## Let's explain the Hello World code

### How Go works

On [Day 8](day08.md) we walked through getting Go installed on your workstation and we then created our first Go application.

In this section, we are going to take a deeper look into the code and understand a few more things about the Go language.

### What is Compiling?

Before we get into the [6 lines of the Hello World code](Go/hello.go) we need to have a bit of an understanding of compiling.

Programming languages that we commonly use, such as Python, Java, Go and C++, are high-level languages, meaning they are human-readable. When a machine tries to execute a program it needs to be in a form the machine can understand, so we have to translate our human-readable code into machine code; this is called compiling.

![](Images/Day9_Go1.png)

From the above you can see what we did on [Day 8](day08.md): we created a simple Hello World main.go and then used the command `go build main.go` to compile our executable.

### What are packages?

A package is a collection of source files in the same directory that are compiled together. To simplify further, a package is a bunch of .go files in the same directory. Remember our Hello folder from Day 8? If and when you get into more complex Go programs you might find that you have folder1, folder2 and folder3 containing different .go files that make up your program from multiple packages.

We use packages so we can reuse other people's code; we don't have to write everything from scratch. Maybe we want a calculator as part of our program: you could probably find an existing Go package that contains the mathematical functions you need, and importing it into your code saves you a lot of time and effort in the long run.

Go encourages you to organise your code in packages so that it is easy to reuse and maintain source code.
### Hello #90DaysOfDevOps Line by Line

Now let's take a look at our Hello #90DaysOfDevOps main.go file and walk through the lines.

![](Images/Day9_Go2.png)

In the first line you have `package main`, which means that this file belongs to a package called main. All .go files need to belong to a package, and they should have `package something` as the opening line.

A package can be named whatever you wish. We have to call this one `main` because it is the starting point of the program that lives in this package; that is a rule. (I need to understand more about this rule?)

![](Images/Day9_Go3.png)

Whenever we want to compile and execute our code we have to tell the machine where execution needs to start. We do this by writing a function called main; the machine will look for a function called main to find the entry point of the program.

A function is a block of code that performs a specific task and can be used across the program.

You can declare a function with any name using `func`, but in this case we need to name it `main` as this is where the code starts.

![](Images/Day9_Go4.png)

Next we are going to look at line 3 of our code, the import. This basically means you want to bring another package into your main program. fmt is a standard package provided by Go; it contains the `Println()` function and, because we have imported it, we can use it in line 6. There are a number of standard packages you can include in your program and leverage or reuse in your code, saving you the hassle of having to write things from scratch. [Go Standard Library](https://pkg.go.dev/std)

![](Images/Day9_Go5.png)

The `Println()` that we have here is a way to write to standard output in the terminal wherever the executable has been executed successfully. Feel free to change the message in between the ().

![](Images/Day9_Go6.png)

### TLDR

- **Line 1** = This file will be in the package called `main`, and it needs to be called `main` because it includes the entry point of the program.
- **Line 3** = For us to use `Println()` we have to import the fmt package so we can use it on line 6.
- **Line 5** = The actual starting point, the `main` function.
- **Line 6** = This will let us print "Hello #90DaysOfDevOps" on our system.
## Resources
@ -75,4 +76,4 @@ the `Println()` that we have here is a way in which to write to a standard outpu
- [FreeCodeCamp - Learn Go Programming - Golang Tutorial for Beginners](https://www.youtube.com/watch?v=YS4e4q9oBaU&t=1025s)
- [Hitesh Choudhary - Complete playlist](https://www.youtube.com/playlist?list=PLRAV69dS1uWSR89FRQGZ6q9BR2b44Tr9N)
See you on [Day 10](day10.md).