diff --git a/.github/FUNDING.yml b/.github/FUNDING.yml new file mode 100644 index 0000000..4bad604 --- /dev/null +++ b/.github/FUNDING.yml @@ -0,0 +1,13 @@ +# These are supported funding model platforms + +github: [MichaelCade] +patreon: # Replace with a single Patreon username +open_collective: # Replace with a single Open Collective username +ko_fi: # michaelcade1 +tidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel +community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry +liberapay: # Replace with a single Liberapay username +issuehunt: # Replace with a single IssueHunt username +otechie: # Replace with a single Otechie username +lfx_crowdfunding: # Replace with a single LFX Crowdfunding project-name e.g., cloud-foundry +custom: # Replace with up to 4 custom sponsorship URLs e.g., ['link1', 'link2'] diff --git a/2022/Days/day11.md b/2022/Days/day11.md index 358569a..efadc9b 100644 --- a/2022/Days/day11.md +++ b/2022/Days/day11.md @@ -51,7 +51,7 @@ You will then see from the below that we built our code with the above example a ![](Images/Day11_Go1.png) -We also know that our challenge is 90 days at least for this challenge, but next, maybe it's 100 so we want to define a variable to help us here as well. However, for our program, we want to define this as a constant. Constants are like variables, except that their value cannot be changed within code (we can still create a new app later on down the line with this code and change this constant but this 90 will not change whilst we are running our application) +We also know that our challenge is 90 days at least for this challenge, but next, maybe it's 100 so we want to define a variable to help us here as well. However, for our program, we want to define this as a constant. 
Constants are like variables, except that their value cannot be changed within code (we can still create a new app later on down the line with this code and change this constant but this 90 will not change while we are running our application) Adding the `const` to our code and adding another line of code to print this. diff --git a/2023.md b/2023.md index 5f3fcf9..76a3839 100644 --- a/2023.md +++ b/2023.md @@ -16,14 +16,14 @@ Or contact us via Twitter, my handle is [@MichaelCade1](https://twitter.com/Mich ## List of Topics -| Topic | Author | Date | Twitter Handle | +| Topic | Author | Date | Twitter Handle | | -------------------------------------- | ----------------------------------- | ------------------- | ----------------------------------------------------------------------------------------------- | | DevSecOps | Michael Cade | 1st Jan - 6th Jan | [@MichaelCade1](https://twitter.com/MichaelCade1) | -| Secure Coding | Prateek Jain | 7th Jan - 13th Jan | [@PrateekJainDev](https://twitter.com/PrateekJainDev) | +| Secure Coding | Prateek Jain | 7th Jan - 13th Jan | [@PrateekJainDev](https://twitter.com/PrateekJainDev) | | Continuous Build, Integration, Testing | Anton Sankov and Svetlomir Balevski | 14th Jan - 20th Jan | [@a_sankov](https://twitter.com/a_sankov) | | Continuous Delivery & Deployment | Anton Sankov | 21st Jan - 27th Jan | [@a_sankov](https://twitter.com/a_sankov) | -| Runtime Defence & Monitoring | Ben Hirschberg | 28th Jan - 3rd Feb | [@slashben81](https://twitter.com/slashben81) | -| Secrets Management | Bryan Krausen | 4th Feb - 10th Feb | [@btkrausen](https://twitter.com/btkrausen) | +| Runtime Defence & Monitoring | Ben Hirschberg | 28th Jan - 3rd Feb | [@slashben81](https://twitter.com/slashben81) | +| Secrets Management | Bryan Krausen | 4th Feb - 10th Feb | [@btkrausen](https://twitter.com/btkrausen) | | Python | Rishab Kumar | 11th Feb - 17th Feb | [@rishabk7](https://twitter.com/rishabk7) | | AWS | Chris Williams | 18th Feb - 24th Feb | 
[@mistwire](https://twitter.com/mistwire) | | OpenShift | Dean Lewis | 25th Feb - 3rd Mar | [@saintdle](https://twitter.com/saintdle) | @@ -34,34 +34,34 @@ Or contact us via Twitter, my handle is [@MichaelCade1](https://twitter.com/Mich ## Progress -- [] ♾️ 1 > [2022 Reflection & Welcome 2023](2023/day01.md) +- [✔️] ♾️ 1 > [2022 Reflection & Welcome 2023](2023/day01.md) ### DevSecOps -- [] ♾️ 2 > [The Big Picture: DevSecOps](2023/day02.md) -- [] ♾️ 3 > [Think like an Attacker](2023/day03.md) -- [] ♾️ 4 > [Red Team vs. Blue Team](2023/day04.md) -- [] ♾️ 5 > [OpenSource Security](2023/day05.md) -- [] ♾️ 6 > [Hands-On: Building a weak app](2023/day06.md) +- [✔️] ♾️ 2 > [The Big Picture: DevSecOps](2023/day02.md) +- [✔️] ♾️ 3 > [Think like an Attacker](2023/day03.md) +- [✔️] ♾️ 4 > [Red Team vs. Blue Team](2023/day04.md) +- [✔️] ♾️ 5 > [OpenSource Security](2023/day05.md) +- [✔️] ♾️ 6 > [Hands-On: Building a weak app](2023/day06.md) ### Secure Coding -- [] ⌨️ 7 > [](2023/day07.md) -- [] ⌨️ 8 > [](2023/day08.md) -- [] ⌨️ 9 > [](2023/day09.md) -- [] ⌨️ 10 > [](2023/day10.md) -- [] ⌨️ 11 > [](2023/day11.md) -- [] ⌨️ 12 > [](2023/day12.md) -- [] ⌨️ 13 > [](2023/day13.md) +- [✔️] ⌨️ 7 > [Secure Coding Overview](2023/day07.md) +- [✔️] ⌨️ 8 > [SAST Overview](2023/day08.md) +- [✔️] ⌨️ 9 > [SAST Implementation with SonarCloud](2023/day09.md) +- [✔️] ⌨️ 10 > [Software Composition Analysis Overview](2023/day10.md) +- [✔️] ⌨️ 11 > [SCA Implementation with OWASP Dependency Check](2023/day11.md) +- [✔️] ⌨️ 12 > [Secure Coding Practices](2023/day12.md) +- [✔️] ⌨️ 13 > [Additional Secure Coding Practices](2023/day13.md) ### Continuous Build, Integration, Testing -- [] 🐧 14 > [](2023/day14.md) -- [] 🐧 15 > [](2023/day15.md) -- [] 🐧 16 > [](2023/day16.md) -- [] 🐧 17 > [](2023/day17.md) -- [] 🐧 18 > [](2023/day18.md) -- [] 🐧 19 > [](2023/day19.md) +- [✔️] 🐧 14 > [Container Image Scanning](2023/day14.md) +- [✔️] 🐧 15 > [Container Image Scanning Advanced](2023/day15.md) +- [✔️] 🐧 16 > 
[Fuzzing](2023/day16.md) +- [✔️] 🐧 17 > [Fuzzing Advanced](2023/day17.md) +- [✔️] 🐧 18 > [DAST](2023/day18.md) +- [✔️] 🐧 19 > [IAST](2023/day19.md) - [] 🐧 20 > [](2023/day20.md) ### Continuous Delivery & Deployment diff --git a/2023/day06.md b/2023/day06.md index f888dde..3449aec 100644 --- a/2023/day06.md +++ b/2023/day06.md @@ -2,3 +2,243 @@ Nobody really sets out to build a weak or vulnerable app... do they? +No is the correct answer, nobody should or does set out to build a weak application, and nobody intends on using packages or other open-source software that brings its own vulnerabilities. + +In this final introduction section into DevSecOps, I want to attempt to build and raise awareness of some of the misconfigurations and weaknesses that might fall by the wayside. Then later over the next 84 days or even sooner we are going to hear from some subject matter experts in the security space on how to prevent bad things and weak applications from being created. + +### Building our first weak application + +**Important Message: This exercise is to highlight bad and weaknesses in an application, Please do try this at home but beware this is bad practice** + +At this stage, I am not going to run through my software development environment in any detail. I would generally be using VScode on Windows with WSL2 enabled. We might then use Vagrant to provision dedicated compute instances to VirtualBox all of which I covered throughout the 2022 sections of #90DaysOfDevOps mostly in the Linux section. + +### Bad Coding Practices or Coding Bad Practices + +It is very easy to copy and paste into GitHub! + +How many people check end-to-end the package that they include in your code? + +We also must consider: + +- Do we trust the user/maintainer +- Not validating input on our code +- Hardcoding secrets vs env or secrets management +- Trusting code without validation +- Adding your secrets to public repositories (How many people have done this?) 
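+To make the hardcoded-secrets point above concrete, here is a minimal sketch in Python (the variable and environment-variable names are hypothetical, purely for illustration):
+
+```python
+import os
+
+# Bad practice: a secret hardcoded in source code ends up in every clone
+# of the repository and stays in the git history forever.
+DB_PASSWORD_BAD = "SuperSecret123!"  # hypothetical example value
+
+
+# Better: read the secret from the environment (or a secrets manager),
+# so the code can be published without leaking credentials.
+def get_db_password() -> str:
+    password = os.environ.get("DB_PASSWORD")  # hypothetical variable name
+    if password is None:
+        raise RuntimeError("DB_PASSWORD is not set; refusing to use a default")
+    return password
+```
+
+The same idea applies in any language: the secret lives outside the repository, and the application fails loudly rather than falling back to a baked-in default.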
+ + Now going back to the overall topic, DevSecOps, everything we are doing or striving towards is faster iterations of our application or software, but this means we can introduce defects and risks faster. + + We will also likely be deploying our infrastructure with code, another risk is including bad code here that lets bad actors in via defects. + + Deployments will also include application configuration management, another level of possible defects. + + However! Faster iterations can and do mean faster fixes as well. + + ### OWASP - Open Web Application Security Project + +*"[OWASP](https://owasp.org/) is a non-profit foundation that works to improve the security of software. Through community-led open-source software projects, hundreds of local chapters worldwide, tens of thousands of members, and leading educational and training conferences, the OWASP Foundation is the source for developers and technologists to secure the web."* + +If we look at their most recent data set and their [top 10](https://owasp.org/www-project-top-ten/) we can see the following big ticket items for why things go bad and wrong. + +1. Broken Access Control +2. Cryptographic Failures +3. Injection (2020 #1) +4. Insecure Design (New for 2021) +5. Security Misconfiguration +6. Vulnerable and Outdated Components (2020 #9) +7. Identification and authentication failures (2020 #2) +8. Software and Data integrity failures (New for 2021) +9. Security logging and monitoring failures (2020 #10) +10. Server-side request forgery (SSRF) + +### Back to the App + +**The warning above still stands, I will deploy this to a local VirtualBox VM IF you do decide to deploy this to a cloud instance then please firstly be careful and secondly know how to lock down your cloud provider to only your own remote IP!** + +Ok I think that is enough warnings, I am sure we might see the red warnings over the next few weeks some more as we get deeper into discussing this topic. 
+ +The application that I am going to be using will be from [DevSecOps.org](https://github.com/devsecops/bootcamp/blob/master/Week-2/README.md) This was one of their bootcamps years ago but still allows us to show what a bad app looks like. + +Having the ability to see a bad or a weak application means we can start to understand how to secure it. + +Once again, I will be using VirtualBox on my local machine and I will be using the following vagrantfile (link here to intro on vagrant) + +The first alarm bell is that this vagrant box was created over 2 years ago! + +``` +Vagrant.configure("2") do |config| + config.vm.box = "centos/7" + config.vm.provider :virtualbox do |v| + v.memory = 8096 + v.cpus = 4 +end +end +``` +If navigate to this folder, you can use `vagrant up` to spin up your centos7 machine in your environment. + +![](images/day06-1.png) + + +Then we will need to access our machine, you can do this with `vagrant ssh` + +We are then going to install MariaDB as a local database to use in our application. + +`sudo yum -y install mariadb mariadb-server mariadb-devel` + +start the service with + +`sudo systemctl start mariadb.service` + +We have to install some dependencies, this is also where I had to change what the Bootcamp suggested as NodeJS was not available in the current repositories. + +`sudo yum -y install links` +`sudo yum install --assumeyes epel-release` +`sudo yum install --assumeyes nodejs` + +You can confirm you have node installed with `node -v` and `npm -v` (npm should be installed as a dependency) + +For this app we will be using ruby a language we have not covered at all yet and we will not really get into much detail about it, I will try and find some good resources and add them below. + +Install with + +`curl -L https://get.rvm.io | bash -s stable` + +You might with the above be asked to add keys follow those steps. 
+ +For us to use rvm we need to do the following: + +`source /home/vagrant/.rvm/scripts/rvm` + +and finally, install it with + +`rvm install ruby-2.7` + +the reason for this long-winded process is basically because the centos7 box we are using is old and old ruby is shipped in the normal repository etc. + +Check installation and version with + +`ruby --version` + +We next need the Ruby on Rails framework which can be gathered using the following command. + +`gem install rails` + +Next, we need git and we can get this with + +`sudo yum install git` + +Just for the record and not sure if it is required, I also had Redis installed on my machine as I was doing something else but it actually still might be needed so these are the steps. + +``` +sudo yum install epel-release +sudo yum install redis +``` + +The above could be related to turbo streams but I did not have time to learn more about ruby on rails. + +Now let’s finally create our application (for the record I went through a lot to make sure these steps worked on my system so I am sending you all the luck) + +create the app with the following, calling it what you wish + +`rails new myapp --skip-turbolinks --skip-spring --skip-test-unit -d mysql ` + +next, we will create the database and schema: + +``` +cd myapp +bundle exec rake db:create +bundle exec rake db:migrate +``` + +We can then run our app with `bundle exec rails server -b 0.0.0.0` + +![](images/day06-2.png) + +Then open a browser to hit that box, I had to change my VirtualBox VM networking to bridged vs NAT so that I would be able to navigate to it vs using vagrant ssh. + +![](images/day06-3.png) + +Now we need to **scaffold** a basic model + +A scaffold is a set of automatically generated files which forms the basic structure of a Rails project. 
+ +We do this with the following commands: + +``` +bundle exec rails generate scaffold Bootcamp name:string description:text dates:string +bundle exec rake db:migrate +``` + +![](images/day06-4.png) + +Add a default route to config/routes.rb + +`root bootcamps#index` + +![](images/day06-5.png) + +Now edit app/views/bootcamps/show.html.erb and make the description field a raw field. Add the below. + +``` +
+<p>
+  <b>Description:</b>
+  <%=raw @bootcamp.description %>
+</p>
+``` +Now why this is all relevant is that using raw in the description field means that this field now becomes a potential XSS target. Or cross-site scripting. + +This can be explained better with a video [What is Cross-Site Scripting?](https://youtu.be/DxsmEXicXEE) + +The rest of the Bootcamp goes on to add in search functionality which also increases the capabilities around an XSS attack and this is another great example of a demo attack you could try out on a [vulnerable app](https://www.softwaretestinghelp.com/cross-site-scripting-xss-attack-test/). + +### Create search functionality + +In app/controllers/bootcamps_controller.rb, we'll add the following logic to the index method: + +``` +def index + @bootcamps = Bootcamp.all + if params[:search].to_s != '' + @bootcamps = Bootcamp.where("name LIKE '%#{params[:search]}%'") + else + @bootcamps = Bootcamp.all + end +end +``` + +In app/views/bootcamps/index.html.erb, we'll add the search field: + +``` +
+<h1>Search</h1>
+<%= form_tag(bootcamps_path, method: "get", id: "search-form") do %> + <%= text_field_tag :search, params[:search], placeholder: "Search Bootcamps" %> + <%= submit_tag "Search Bootcamps"%> +<% end %> + +
+<h1>Listing Bootcamps</h1>
+``` + +Massive thanks for [DevSecOps.org](https://www.devsecops.org/) this is where I found the old but great walkthrough with a few tweaks above, there is also so much more information to be found there. + +With that much longer walkthrough than anticipated I am going to hand over to the next sections and authors to highlight how not to do this and how to make sure we are not releasing bad code or vulnerabilities out there into the wild. + +## Resources + +- [devsecops.org](https://www.devsecops.org/) + +- [TechWorld with Nana - What is DevSecOps? DevSecOps explained in 8 Mins](https://www.youtube.com/watch?v=nrhxNNH5lt0&list=PLsKoqAvws1pvg7qL7u28_OWfXwqkI3dQ1&index=1&t=19s) + +- [What is DevSecOps?](https://www.youtube.com/watch?v=J73MELGF6u0&list=PLsKoqAvws1pvg7qL7u28_OWfXwqkI3dQ1&index=2&t=1s) + +- [freeCodeCamp.org - Web App Vulnerabilities - DevSecOps Course for Beginners](https://www.youtube.com/watch?v=F5KJVuii0Yw&list=PLsKoqAvws1pvg7qL7u28_OWfXwqkI3dQ1&index=3&t=67s) + +- [The Importance of DevSecOps and 5 Steps to Doing it Properly (DevSecOps EXPLAINED)](https://www.youtube.com/watch?v=KaoPQLyWq_g&list=PLsKoqAvws1pvg7qL7u28_OWfXwqkI3dQ1&index=4&t=13s) + +- [Continuous Delivery - What is DevSecOps?](https://www.youtube.com/watch?v=NdvMUcWNlFw&list=PLsKoqAvws1pvg7qL7u28_OWfXwqkI3dQ1&index=5&t=6s) + +- [Cloud Advocate - What is DevSecOps?](https://www.youtube.com/watch?v=a2y4Oj5wrZg&list=PLsKoqAvws1pvg7qL7u28_OWfXwqkI3dQ1&index=6) + +- [Cloud Advocate - DevSecOps Pipeline CI Process - Real world example!](https://www.youtube.com/watch?v=ipe08lFQZU8&list=PLsKoqAvws1pvg7qL7u28_OWfXwqkI3dQ1&index=7&t=204s) + +See you on [Day 7](day07.md) Where we will start a new section on Secure Coding. 
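+As a footnote to the walkthrough above: the interpolated `Bootcamp.where("name LIKE '%#{params[:search]}%'")` in the search controller is also a classic SQL injection sink, not just an XSS helper. A hedged sketch of the same bug and its parameterized fix, using Python and an in-memory SQLite table (table and column names are illustrative, not from the Rails app):
+
+```python
+import sqlite3
+
+conn = sqlite3.connect(":memory:")
+conn.execute("CREATE TABLE bootcamps (name TEXT)")
+conn.execute("INSERT INTO bootcamps VALUES ('DevSecOps 101')")
+
+
+# Vulnerable: string interpolation builds the query, like the Ruby
+# where("name LIKE '%#{params[:search]}%'") in the controller above.
+def search_vulnerable(term):
+    return conn.execute(
+        f"SELECT name FROM bootcamps WHERE name LIKE '%{term}%'"
+    ).fetchall()
+
+
+# Safer: a bound parameter keeps the search term as data, never as SQL.
+def search_safe(term):
+    return conn.execute(
+        "SELECT name FROM bootcamps WHERE name LIKE ?", (f"%{term}%",)
+    ).fetchall()
+```
+
+With an input like `zzz%' OR '1'='1' --`, the vulnerable version returns every row, while the parameterized version simply finds no match.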
+ diff --git a/2023/day07.md b/2023/day07.md index e69de29..d04a512 100644 --- a/2023/day07.md +++ b/2023/day07.md @@ -0,0 +1,42 @@ +# Day 7: Secure Coding Overview + +Secure coding is the practice of writing software in a way that ensures the security of the system and the data it processes. It involves designing, coding, and testing software with security in mind to prevent vulnerabilities and protect against potential attacks. + +There are several key principles of secure coding that developers should follow: + +1. Input validation: It is important to validate all user input to ensure that it is in the expected format and does not contain any malicious code or unexpected characters. This can be achieved through the use of regular expressions, data type checks, and other validation techniques. +2. Output encoding: Output data should be properly encoded to prevent any potential injection attacks. For example, HTML output should be properly escaped to prevent cross-site scripting (XSS) attacks, and SQL queries should be parameterized to prevent SQL injection attacks. +3. Access control: Access control involves restricting access to resources or data to only those users who are authorized to access them. This can include implementing authentication and authorization protocols, as well as enforcing least privilege principles to ensure that users have only the access rights they need to perform their job duties. +4. Error handling: Error handling is the process of properly handling errors and exceptions that may occur during the execution of a program. This can include logging errors, displaying appropriate messages to users, and mitigating the impact of errors on system security. +5. Cryptography: Cryptography should be used to protect sensitive data and communications, such as passwords, financial transactions, and sensitive documents. This can be achieved through the use of encryption algorithms and secure key management practices. +6. 
Threat Modeling: Document, locate, address, and validate are the four steps to threat modeling. To securely code, you need to examine your software for areas susceptible to increased threats of attack. Threat modeling is a multi-stage process that should be integrated into the software lifecycle from development, testing, and production. +7. Secure storage: Secure storage involves properly storing and handling sensitive data, such as passwords and personal information, to prevent unauthorized access or tampering. This can include using encryption, hashing, and other security measures to protect data at rest and in transit. +8. Secure architecture: Secure architecture is the foundation of a secure system. This includes designing systems with security in mind, using secure frameworks and libraries, and following secure design patterns. + +There are several tools and techniques that can be used to help ensure that code is secure, including Static Application Security Testing (SAST), Software Composition Analysis (SCA), and Secure Code Review. + +### Static Application Security Testing (SAST) + +SAST is a method of testing software code for security vulnerabilities during the development phase. It involves analyzing the source code of a program without executing it, looking for vulnerabilities such as injection attacks, cross-site scripting (XSS), and other common security issues. SAST tools can be integrated into the software development process to provide ongoing feedback and alerts about potential vulnerabilities as the code is being written. + +### Software Composition Analysis (SCA) + +SCA is a method of analyzing the third-party components and libraries that are used in a software application. It helps to identify any vulnerabilities or security risks that may be present in these components, and can alert developers to the need to update or replace them. SCA can be performed manually or with the use of automated tools. 
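+Principles 1 and 2 above (input validation and output encoding) can be sketched in a few lines; this is a toy Python illustration with hypothetical function names, not a complete validation library:
+
+```python
+import html
+import re
+
+# Principle 1, input validation: accept only the expected shape
+# (here, a short alphanumeric username) and reject everything else.
+USERNAME_RE = re.compile(r"^[A-Za-z0-9_]{3,20}$")
+
+
+def is_valid_username(value: str) -> bool:
+    return bool(USERNAME_RE.match(value))
+
+
+# Principle 2, output encoding: escape user-controlled text before
+# embedding it in HTML, so "<script>" renders as text instead of running.
+def render_greeting(name: str) -> str:
+    return f"<p>Hello, {html.escape(name)}</p>"
+```
+
+Validation rejects malformed input at the boundary; encoding neutralizes whatever still reaches the output, so the two defenses complement rather than replace each other.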
+ +### Secure Code Reviews + +Secure Code Review is a process of reviewing software code with the goal of identifying and addressing potential security vulnerabilities. It is typically performed by a team of security experts who are familiar with common coding practices and security best practices. Secure Code Review can be done manually or with the use of automated tools, and may involve a combination of SAST and SCA techniques. + +In summary, Overall, secure coding is a crucial practice that helps protect software and its users from security vulnerabilities and attacks. By following best practices and keeping software up to date, developers can help ensure that their software is as secure as possible. + +### Resources + +- [Secure Coding Best Practices | OWASP Top 10 Proactive Control](https://www.youtube.com/watch?v=8m1N2t-WANc) + +- [Secure coding practices every developer should know](https://snyk.io/learn/secure-coding-practices/) + +- [10 Secure Coding Practices You Can Implement Now](https://codesigningstore.com/secure-coding-practices-to-implement) + +- [Secure Coding Guidelines And Best Practices For Developers](https://www.softwaretestinghelp.com/guidelines-for-secure-coding/) + +In the next part [Day 8](day08.md), we will discuss Static Application Security Testing (SAST) in more detail. diff --git a/2023/day08.md b/2023/day08.md index e69de29..1af912c 100644 --- a/2023/day08.md +++ b/2023/day08.md @@ -0,0 +1,54 @@ +# Day 8: SAST Overview + +Static Application Security Testing (SAST) is a method of evaluating the security of an application by analyzing the source code of the application without executing the code. SAST is also known as white-box testing as it involves testing the internal structure and workings of an application. + +SAST is performed early in the software development lifecycle (SDLC) as it allows developers to identify and fix vulnerabilities before the application is deployed. 
This helps prevent security breaches and minimizes the risk of costly security incidents. + +One of the primary benefits of SAST is that it can identify vulnerabilities that may not be detected by other testing methods such as dynamic testing or manual testing. This is because SAST analyzes the entire codebase and can identify vulnerabilities that may not be detectable by other testing methods. + +There are several types of vulnerabilities that SAST can identify, including: + +- **Input validation vulnerabilities**: These vulnerabilities occur when an application does not adequately validate user input, allowing attackers to input malicious code or data that can compromise the security of the application. +- **Cross-site scripting (XSS) vulnerabilities**: These vulnerabilities allow attackers to inject malicious scripts into web applications, allowing them to steal sensitive information or manipulate the application for their own gain. +- **Injection vulnerabilities**: These vulnerabilities allow attackers to inject malicious code or data into the application, allowing them to gain unauthorized access to sensitive information or execute unauthorized actions. +- **Unsafe functions and libraries**: These vulnerabilities occur when an application uses unsafe functions or libraries that can be exploited by attackers. +- **Security misconfigurations**: These vulnerabilities occur when an application is not properly configured, allowing attackers to gain access to sensitive information or execute unauthorized actions. + +### SAST Tools (with free tier plan) + +- **[SonarCloud](https://www.sonarsource.com/products/sonarcloud/)**: SonarCloud is a cloud-based code analysis service designed to detect code quality issues in 25+ different programming languages, continuously ensuring the maintainability, reliability and security of your code. 
+- **[Snyk](https://snyk.io/)**: Snyk is a platform allowing you to scan, prioritize, and fix security vulnerabilities in your own code, open source dependencies, container images, and Infrastructure as Code (IaC) configurations. +- **[Semgrep](https://semgrep.dev/)**: Semgrep is a fast, open source, static analysis engine for finding bugs, detecting dependency vulnerabilities, and enforcing code standards. + +## How SAST Works? + +SAST tools typically use a variety of techniques to analyze the sourced code, including pattern matching, rule-based analysis, and data flow analysis. + +Pattern matching involves looking for specific patterns in the code that may indicate a vulnerability, such as the use of a known vulnerable library or the execution of user input without proper sanitization. + +Rule-based analysis involves the use of a set of predefined rules to identify potential vulnerabilities, such as the use of weak cryptography or the lack of input validation. + +Data flow analysis involves tracking the flow of data through the application and identifying potential vulnerabilities that may arise as a result, such as the handling of sensitive data in an insecure manner. + +## Consideration while using SAST Tools + +1. It is important to ensure that the tool is properly configured and that it is being used in a way that is consistent with best practices. This may include setting the tool's sensitivity level to ensure that it is properly identifying vulnerabilities, as well as configuring the tool to ignore certain types of vulnerabilities that are known to be benign. +2. SAST tools are not a replacement for manual code review. While these tools can identify many potential vulnerabilities, they may not be able to identify all of them, and it is important for developers to manually review the code to ensure that it is secure. +3. SAST is just one aspect of a comprehensive application security program. 
While it can be an important tool for identifying potential vulnerabilities, it is not a replacement for other security measures, such as secure coding practices, testing in the production environment, and ongoing monitoring and maintenance. + +### Challenges associated with SAST + +- **False positives**: Automated SAST tools can sometimes identify potential vulnerabilities that are not actually vulnerabilities. This can lead to a large number of false positives that need to be manually reviewed, increasing the time and cost of the testing process. +- **Limited coverage**: SAST can only identify vulnerabilities in the source code that is analyzed. If an application uses external libraries or APIs, these may not be covered by the SAST process. +- **Code complexity**: SAST can be more challenging for larger codebases or codebases that are written in languages that are difficult to analyze. +- **Limited testing**: SAST does not execute the code and therefore cannot identify vulnerabilities that may only occur when the code is executed. + +Despite these challenges, SAST is a valuable method of evaluating the security of an application and can help organizations prevent security breaches and minimize the risk of costly security incidents. By identifying and fixing vulnerabilities early in the SDLC, organizations can build more secure applications and improve the overall security of their systems. + +### Resources + +- [SAST- Static Analysis with lab by Practical DevSecOps](https://www.youtube.com/watch?v=h37zp5g5tO4) +- [SAST – All About Static Application Security Testing](https://www.mend.io/resources/blog/sast-static-application-security-testing/) +- [SAST Tools : 15 Top Free and Paid Tools](https://www.appsecsanta.com/sast-tools) + +In the next part [Day 9](day09.md), we will discuss SonarCloud and integrate it with different CI/CD tools. 
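+The pattern-matching layer described in "How SAST Works" can be sketched in a few lines of Python. The rules below are hypothetical toy examples (real SAST tools combine this with AST and data-flow analysis and ship far richer rule sets):
+
+```python
+import re
+
+# Toy rule set: (rule id, compiled pattern, finding message).
+RULES = [
+    ("hardcoded-secret", re.compile(r"(password|secret)\s*=\s*['\"]"),
+     "possible hardcoded credential"),
+    ("dangerous-eval", re.compile(r"\beval\("),
+     "eval() on untrusted input can lead to code injection"),
+]
+
+
+def scan_source(source: str):
+    """Return (line number, rule id, message) findings for one file."""
+    findings = []
+    for lineno, line in enumerate(source.splitlines(), start=1):
+        for rule_id, pattern, message in RULES:
+            if pattern.search(line):
+                findings.append((lineno, rule_id, message))
+    return findings
+```
+
+Even this naive scanner shows why false positives happen: a line like `password = ""` in a test fixture matches the same pattern as a real leaked credential, which is why findings still need human review.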
diff --git a/2023/day09.md b/2023/day09.md index e69de29..0a104be 100644 --- a/2023/day09.md +++ b/2023/day09.md @@ -0,0 +1,132 @@ +# Day 9: SAST Implementation with SonarCloud + +SonarCloud is a cloud-based platform that provides static code analysis to help developers find and fix code quality issues in their projects. It is designed to work with a variety of programming languages and tools, including Java, C#, JavaScript, and more. + +SonarCloud offers a range of features to help developers improve the quality of their code, including: + +- **Static code analysis**: SonarCloud analyzes the source code of a project and checks for issues such as coding style violations, potential bugs, security vulnerabilities, and other problems. It provides developers with a detailed report of the issues it finds, along with suggestions for how to fix them. +- **Code review**: SonarCloud integrates with code review tools like GitHub pull requests, allowing developers to receive feedback on their code from their peers before it is merged into the main branch. This helps to catch issues early on in the development process, reducing the risk of bugs and other issues making it into production. +- **Continuous integration**: SonarCloud can be integrated into a continuous integration (CI) pipeline, allowing it to automatically run static code analysis on every code commit. This helps developers catch issues early and fix them quickly, improving the overall quality of their codebase. +- **Collaboration**: SonarCloud includes tools for team collaboration, such as the ability to assign issues to specific team members and track the progress of code review and issue resolution. +- **Customization**: SonarCloud allows developers to customize the rules and configurations used for static code analysis, so they can tailor the analysis to fit the specific needs and coding standards of their team. 
+ +Overall, SonarCloud is a valuable tool for developers looking to improve the quality of their code and reduce the risk of issues making it into production. It helps teams collaborate and catch problems early on in the development process, leading to faster, more efficient development and fewer bugs in the final product. + +Read more about SonarCloud [here](https://docs.sonarcloud.io/) + +### Integrate SonarCloud with GitHub Actions + +- Sign up for a [SonarCloud](https://sonarcloud.io/) account with your GitHub Account. +- From the dashboard, click on “Import an organization from GitHub” + +![](images/day09-1.png) + +- Authorise and install SonarCloud app to access your GitHub account. + +![](images/day09-2.png) + +- Select the repository (free tier supports only public repositories) you want to analyze and click "Install" +![](images/day09-3.png) + +- In SonarCloud you can now create an organisation. + +![](images/day09-4.png) +![](images/day09-5.png) + +- Now click on “Analyze a new Project” + +![](images/day09-6.png) + +- Click on setup to add the Project. + +![](images/day09-7.png) + +- Now on the SonarCloud dashboard you can the project. + +![](images/day09-8.png) + +- To setup the GitHub Actions, click on the project, then on **Information** > **Last analysis method** + +![](images/day09-9.png) + +- Click on **GitHub Actions** + +![](images/day09-10.png) + +- This will show some steps to integrate SonarCloud with GitHub actions. At the top you will see SONAR_TOKEN, we will add that as Github Secrets later. 
+ +![](images/day09-11.png) + +- Next thing you will see is the yaml file for the GitHub Workflow + +![](images/day09-12.png) + +- You will also see a configuration file that we will have to add in the source code repo + +![](images/day09-13.png) +![](images/day09-14.png) + +- At the bottom of page, disable the Automatic Analysis +![](images/day09-15.png) + +- Now go the source code repo and add the following configuration `sonar-project.properties` in the root directory. + +```yaml +sonar.projectKey=prateekjaindev_nodejs-todo-app-demo +sonar.organization=prateekjaindev + +# This is the name and version displayed in the SonarCloud UI. +#sonar.projectName=nodejs-todo-app-demo +#sonar.projectVersion=1.0 + +# Path is relative to the sonar-project.properties file. Replace "\" by "/" on Windows. +#sonar.sources=. + +# Encoding of the source code. Default is default system encoding +#sonar.sourceEncoding=UTF-8 +``` + +- Update or add the GitHub actions workflow with the following job in the `.github/workflows` directory + +```yaml +name: SonarScan +on: + push: + branches: + - main + pull_request: + types: [opened, synchronize, reopened] +jobs: + sonarcloud: + name: SonarCloud + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v2 + with: + fetch-depth: 0 # Shallow clones should be disabled for a better relevancy of analysis + - name: SonarCloud Scan + uses: SonarSource/sonarcloud-github-action@master + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Needed to get PR information, if any + SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }} +``` +- Now go to GitHub and add GitHub Secret named SOANR_TOKEN. +![](images/day09-16.png) +- As soon as you commit the changes, the workflow will trigger. +![](images/day09-17.png) +- Now after every commit, you can check the updated reports on the SonarCloud dashboard. 
+![](images/day09-18.png)
+
+### Quality Gates
+
+A quality gate is an indicator that tells you whether your code meets the minimum level of quality required for your project. It consists of a set of conditions that are applied to the results of each analysis. If the analysis results meet or exceed the quality gate conditions, it shows a **Passed** status; otherwise, it shows a **Failed** status.
+
+By default, SonarCloud comes with a built-in quality gate called “Sonar way”. You can edit it or create a new one in the Organisation Settings.
+![](images/day09-19.png)
+
+### Resources
+
+- [SonarCloud Documentation](https://docs.sonarcloud.io/)
+- [How to create Quality gates on SonarQube](https://www.youtube.com/watch?v=8_Xt9vchlpY)
+- [Source Code of the repo I used for SAST implementation](https://github.com/prateekjaindev/nodejs-todo-app-demo)
+
+In the next part [Day 10](day10.md), we will discuss Software Composition Analysis (SCA).
\ No newline at end of file
diff --git a/2023/day10.md index e69de29..6ea50b1 100644 --- a/2023/day10.md +++ b/2023/day10.md @@ -0,0 +1,33 @@
+# Day 10: Software Composition Analysis Overview
+
+Software composition analysis (SCA) is a process that helps developers identify the open source libraries, frameworks, and components that are included in their software projects. SCA tools scan the codebase of a software project and provide a report that lists all the open source libraries, frameworks, and components that are being used. This report includes information about the licenses and vulnerabilities of these open source libraries and components, as well as any security risks that may be associated with them.
+
+There are several benefits to using SCA tools in software development projects. These benefits include:
+
+1. **Improved security**: By identifying the open source libraries and components that are being used in a project, developers can assess the security risks associated with these libraries and components.
This allows them to take appropriate measures to fix any vulnerabilities and protect their software from potential attacks. +2. **Enhanced compliance**: SCA tools help developers ensure that they are using open source libraries and components that are compliant with the appropriate licenses. This is particularly important for companies that have strict compliance policies and need to ensure that they are not infringing on any third-party intellectual property rights. +3. **Improved efficiency**: SCA tools can help developers save time and effort by automating the process of identifying and tracking open source libraries and components. This allows developers to focus on more important tasks, such as building and testing their software. +4. **Reduced risk**: By using SCA tools, developers can identify and fix vulnerabilities in open source libraries and components before they become a problem. This helps to reduce the risk of security breaches and other issues that could damage the reputation of the software and the company. +5. **Enhanced quality**: By identifying and addressing any vulnerabilities in open source libraries and components, developers can improve the overall quality of their software. This leads to a better user experience and a higher level of customer satisfaction. + +In addition to these benefits, SCA tools can also help developers to identify any potential legal issues that may arise from the use of open source libraries and components. For example, if a developer is using a library that is licensed under a copyleft license, they may be required to share any changes they make to the library with the community. + +Despite these benefits, there are several challenges associated with SCA: + +1. **Scale**: As the use of open source software has become more widespread, the number of components that need to be analyzed has grown exponentially. 
This can make it difficult for organizations to keep track of all the components they are using and to identify any potential issues.
+2. **Complexity**: Many software applications are made up of a large number of components, some of which may have been added years ago and are no longer actively maintained. This can make it difficult to understand the full scope of an application and to identify any potential issues.
+3. **False positives**: SCA tools can generate a large number of alerts, some of which may be false positives. This can be frustrating for developers who have to review and dismiss these alerts, and it can also lead to a lack of trust in the SCA tool itself.
+4. **Lack of standardization**: There is no standard way to conduct SCA, and different tools and approaches can produce different results. This can make it difficult for organizations to compare the results of different SCA tools and to determine which one is best for their needs.
+
+Overall, SCA tools provide a number of benefits to software developers and can help to improve the security, compliance, efficiency, risk management, and quality of software projects. By using these tools, developers can ensure that they are using open source libraries and components that are compliant with the appropriate licenses, free of vulnerabilities, and of high quality. This helps to protect the reputation of their software and the company, and leads to a better user experience.
+
+### SCA Tools (Open Source or Free Tier)
+- **[OWASP Dependency Check](https://owasp.org/www-project-dependency-check/)**: Dependency-Check is a Software Composition Analysis (SCA) tool that attempts to detect publicly disclosed vulnerabilities contained within a project’s dependencies. It does this by determining if there is a Common Platform Enumeration (CPE) identifier for a given dependency. If found, it will generate a report linking to the associated CVE entries.
+- **[Snyk](https://snyk.io/product/open-source-security-management/)**: Snyk Open Source provides a developer-first SCA solution, helping developers find, prioritize, and fix security vulnerabilities and license issues in open source dependencies. + +### Resources + +- [Software Composition Analysis (SCA): What You Should Know](https://www.aquasec.com/cloud-native-academy/supply-chain-security/software-composition-analysis-sca/) +- [Software Composition Analysis 101: Knowing what’s inside your apps - Magno Logan](https://www.youtube.com/watch?v=qyVDHH4T1oo) + +In the next part [Day 11](day11.md), we will discuss Dependency Check and integrate it with GitHub Actions. \ No newline at end of file diff --git a/2023/day11.md b/2023/day11.md index e69de29..97a7c16 100644 --- a/2023/day11.md +++ b/2023/day11.md @@ -0,0 +1,69 @@ +# Day 11: SCA Implementation with OWASP Dependency Check + +### OWASP Dependency Check + +OWASP Dependency Check is an open-source tool that checks project dependencies for known vulnerabilities. It can be used to identify dependencies with known vulnerabilities and determine if any of those vulnerabilities are exposed in the application. + +The tool works by scanning the dependencies of a project and checking them against a database of known vulnerabilities. If a vulnerability is found, the tool will report the vulnerability along with the associated CVE (Common Vulnerabilities and Exposures) identifier, a standardized identifier for publicly known cybersecurity vulnerabilities. + +To use OWASP Dependency Check, you will need to include it as a part of your build process. There are integrations available for a variety of build tools, including Maven, Gradle, and Ant. You can also use the command-line interface to scan your dependencies. + +OWASP Dependency Check is particularly useful for identifying vulnerabilities in third-party libraries and frameworks that your application depends on. 
These types of dependencies can introduce vulnerabilities into your application if they are not properly managed. By regularly scanning your dependencies, you can ensure that you are aware of any vulnerabilities and take steps to address them.
+
+It is important to note that OWASP Dependency Check is not a replacement for secure coding practices and should be used in conjunction with other security measures. It is also important to regularly update dependencies to ensure that you are using the most secure version available.
+
+### Integrate Dependency Check with GitHub Actions
+
+To use Dependency Check with GitHub Actions, you can create a workflow file in your repository's `.github/workflows` directory. Here is an example workflow that runs Dependency Check on every push to the `main` branch and on pull requests:
+
+```yaml
+name: Dependency-Check
+on:
+  push:
+    branches:
+      - main
+  pull_request:
+    types: [opened, synchronize, reopened]
+jobs:
+  dependency-check:
+    name: Dependency-Check
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v2 # Check out the repository so there is something to scan
+      - name: Download OWASP Dependency Check
+        run: |
+          VERSION=$(curl -s https://jeremylong.github.io/DependencyCheck/current.txt)
+          curl -sL "https://github.com/jeremylong/DependencyCheck/releases/download/v$VERSION/dependency-check-$VERSION-release.zip" --output dependency-check.zip
+          unzip dependency-check.zip
+      - name: Run Dependency Check
+        run: |
+          ./dependency-check/bin/dependency-check.sh --out report.html --scan .
+          rm -rf dependency-check*
+
+      - name: Upload Artifacts
+        uses: actions/upload-artifact@v2
+        with:
+          name: artifacts
+          path: report.html
+```
+
+This workflow does the following:
+
+1. Defines a workflow called `Dependency-Check` that runs on every push to the `main` branch and on pull requests.
+2. Specifies that the workflow should run on the `ubuntu-latest` runner.
+3. Checks out the repository, then downloads and installs Dependency Check.
+4. Runs Dependency Check on the current directory (`.`) and generates a report in the `report.html` file.
+5. Removes the downloaded Dependency Check files.
+6.
Uploads the report file as an artifact.
+
+You can download the report from the workflow artifacts and open it in the browser.
+
+![](images/day11-1.png)
+
+You can customize this workflow to fit your needs. For example, you can specify different branches to run the workflow on, or specify different dependencies to check. You can also configure Dependency Check to generate a report in a specific format (e.g., HTML, XML, JSON) and save it to the repository.
+
+### Resources
+
+- [Dependency Check Documentation](https://jeremylong.github.io/DependencyCheck/)
+- [Source Code of the repo I used for SCA implementation](https://github.com/prateekjaindev/nodejs-todo-app-demo)
+
+In the next part [Day 12](day12.md), we will discuss Secure Coding Review.
\ No newline at end of file
diff --git a/2023/day12.md index e69de29..30b24ca 100644 --- a/2023/day12.md +++ b/2023/day12.md @@ -0,0 +1,33 @@
+# Day 12: Secure Coding Review
+
+Secure code review is the process of examining and evaluating the security of a software application or system by reviewing the source code for potential vulnerabilities or weaknesses. This process is an essential part of ensuring that an application is secure and can withstand attacks from cyber criminals.
+
+There are several steps involved in a secure code review process:
+
+1. **Identify the scope of the review**: The first step is to identify the scope of the review, including the type of application being reviewed and the specific security concerns that need to be addressed.
+2. **Set up a review team**: A review team should be composed of individuals with expertise in different areas, such as security, coding, and testing. The team should also include individuals who are familiar with the application being reviewed.
+3. **Prepare the code for review**: Before the review can begin, the code needs to be prepared for review by organizing it in a way that makes it easier to understand and review.
This may include breaking the code down into smaller chunks or adding comments to explain the purpose of specific sections. +4. **Conduct the review**: During the review, the team will examine the code for vulnerabilities and weaknesses. This may include checking for insecure coding practices, such as hardcoded passwords or unencrypted data, or looking for vulnerabilities in the application’s architecture. +5. **Document findings**: As the team identifies potential vulnerabilities or weaknesses, they should document their findings in a report. The report should include details about the vulnerability, the potential impact, and recommendations for how to fix the issue. +6. **Remediate vulnerabilities**: Once the review is complete, the team should work with the development team to fix any vulnerabilities or weaknesses that were identified. This may involve updating the code, implementing additional security controls, or both. + +There are several tools and techniques that can be used to facilitate a secure code review. These may include: + +1. **Static analysis tools**: These tools analyze the code without executing it, making them useful for identifying vulnerabilities such as buffer overflows, SQL injection, and cross-site scripting. +2. **Dynamic analysis tools**: These tools analyze the code while it is being executed, allowing the review team to identify vulnerabilities that may not be detectable through static analysis alone. +3. **Code review guidelines**: Many organizations have developed guidelines for conducting code reviews, which outline the types of vulnerabilities that should be looked for and the best practices for remediation. +4. **Peer review**: Peer review is a process in which other developers review the code, providing a second set of eyes to identify potential vulnerabilities. + +Secure code review is an ongoing process that should be conducted at various stages throughout the development lifecycle. 
This includes reviewing code before it is deployed to production, as well as conducting periodic reviews to ensure that the application remains secure over time. + +Overall, secure code review is a critical component of ensuring that an application is secure. By identifying and addressing vulnerabilities early in the development process, organizations can reduce the risk of attacks and protect their systems and data from potential threats. + +I highly recommend watching this video to understand how source code analysis can lead to finding vulnerabilities in large enterprise codebases. + +[![Final video of fixing issues in your code in VS Code](https://img.youtube.com/vi/fb-t3WWHsMQ/maxresdefault.jpg)](https://www.youtube.com/watch?v=fb-t3WWHsMQ) +### Resources + +- [How to Analyze Code for Vulnerabilities](https://www.youtube.com/watch?v=A8CNysN-lOM&t) +- [What Is A Secure Code Review And Its Process?](https://valuementor.com/blogs/source-code-review/what-is-a-secure-code-review-and-its-process/) + +In the next part [Day 13](day13.md), we will discuss Additional Secure Coding Practices with some more hands-on. \ No newline at end of file diff --git a/2023/day13.md b/2023/day13.md index e69de29..721f510 100644 --- a/2023/day13.md +++ b/2023/day13.md @@ -0,0 +1,89 @@ +# Day 13: Additional Secure Coding Practices + +## Git Secret Scan + +Scanning repositories for secrets refers to the process of searching through a code repository, such as on GitHub or GitLab, for sensitive information that may have been inadvertently committed and pushed to the repository. This can include sensitive data such as passwords, API keys, and private encryption keys. + +The process is usually done using automated tools that scan the code for specific patterns or keywords that indicate the presence of sensitive information. 
The goal of this process is to identify and remove any secrets that may have been exposed in the repository, in order to protect against potential breaches or unauthorized access.
+
+### Git Secret Scan with Gitleaks
+
+Gitleaks is a tool that can be added to your GitHub repository as a GitHub Action, which scans your codebase for sensitive information such as credentials, tokens, and other secrets. The action runs the gitleaks tool on your codebase, which checks for any sensitive information that may have been accidentally committed to your repository.
+
+To set up the Gitleaks GitHub Action, create a new workflow file at `.github/workflows/git-secret-scan.yml` in your repository. The workflow file should contain the following:
+
+```yaml
+name: gitleaks
+on:
+  pull_request:
+  push:
+jobs:
+  scan:
+    name: gitleaks
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v3
+        with:
+          fetch-depth: 0
+      - uses: gitleaks/gitleaks-action@v2
+        env:
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+```
+
+This workflow does the following:
+
+1. Defines a workflow called `gitleaks` that runs on every push and pull request.
+2. Specifies that the workflow should run on the `ubuntu-latest` runner.
+3. Runs a Gitleaks scan over the entire repository history (`fetch-depth: 0` disables the shallow clone so all commits are scanned).
+4. Fails the build if it detects any secret.
+
+In my demo, I added AWS keys to a `.env` file, and because of that the pipeline failed.
+
+![](images/day13-1.png)
+
+Other Git secret scanning tools:
+
+- [**AWS git-secrets**](https://github.com/awslabs/git-secrets)
+- **[GitGuardian ggshield](https://github.com/GitGuardian/ggshield)**
+- **[TruffleHog](https://github.com/trufflesecurity/trufflehog)**
+
+### Resources
+
+- [Gitleaks GitHub](https://github.com/zricethezav/gitleaks)
+- [Gitleaks GitHub Action](https://github.com/gitleaks/gitleaks-action)
+
+## Create better Dockerfile with Hadolint
+
+Hadolint is a linter for Dockerfiles that checks for common mistakes and provides suggestions for improvement.
It can be used directly from the command line, integrated into a CI/CD pipeline, or integrated into code editors and IDEs for real-time linting.
+
+To set up linting with Hadolint in GitHub Actions, you can use the following steps:
+
+1. Create a new workflow file in your repository, for example `.github/workflows/dockerfile-lint.yml`
+2. In this file, add the following code to set up the GitHub Actions workflow:
+
+```yaml
+name: Lint Dockerfile
+on:
+  push:
+    branches:
+      - main
+jobs:
+  lint:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v2
+      - uses: hadolint/hadolint-action@v2.1.0
+        with:
+          dockerfile: Dockerfile
+```
+
+3. This workflow will run on every push to the `main` branch, and will run Hadolint on the `Dockerfile` file.
+4. Commit the new workflow file and push it to your repository.
+5. The next time you push changes to the `main` branch, GitHub Actions will run the linting job and provide feedback if any issues are found with your Dockerfile.
+
+### Resources
+
+- [Hadolint GitHub](https://github.com/hadolint/hadolint)
+- [Hadolint Online](https://hadolint.github.io/hadolint/)
+- [Top 20 Dockerfile best practices](https://sysdig.com/blog/dockerfile-best-practices/)
+
+Next up we will be starting our **Continuous Build, Integration, Testing** with [Day 14](day14.md) covering Container Image Scanning from [Anton Sankov](https://twitter.com/a_sankov).
\ No newline at end of file
diff --git a/2023/day15.md index 94bc680..ee6977b 100644 --- a/2023/day15.md +++ b/2023/day15.md @@ -228,3 +228,6 @@ It is between 0 and 10.
+
+
+On [Day 16](day16.md) we will take a look into "Fuzzing" or Fuzz Testing.
\ No newline at end of file
diff --git a/2023/day17.md index 8a9fc53..cf7f191 100644 --- a/2023/day17.md +++ b/2023/day17.md @@ -1,26 +1,242 @@
-# DAST
-DAST, or Dynamic Application Security Testing, is a technique that is used to evaluate the security of an application by simulating attacks from external sources.
-Idea is to automate as much as possible black-box penetration testing.
-It can be used for acquiring the low-hanging fruits so a real human’s time will be spared and additionally for generating traffic to other security tools (e.g. IAST).
+# Fuzzing Advanced
-Nevertheless, It is an essential component of the SSDLC, as it helps organizations uncover potential vulnerabilities early in the development process, before the application is deployed to production. By conducting DAST testing, organizations can prevent security incidents and protect their data and assets from being compromised by attackers.
+Yesterday we learned what fuzzing is and how to write fuzz tests (unit tests with fuzzy inputs).
+However, fuzz testing goes beyond just unit testing.
+We can use this methodology to test our web application by fuzzing the requests sent to our server.
-## Tools
+Today, we will take a practical approach to fuzz testing a web server.
-There are various open-source tools available for conducting DAST, such as ZAP, Burp Suite, and Arachni. These tools can simulate different types of attacks on the application, such as SQL injection, cross-site scripting, and other common vulnerabilities. For example, if an application is vulnerable to SQL injection, a DAST tool can send a malicious SQL query to the application, such as ' OR 1=1 --, and evaluate its response to determine if it is vulnerable. If the application is vulnerable, it may return all records from the database, indicating that the SQL injection attack was successful.
-As some of the tests could be quite invasive (for example it may include ‘DROP TABLE’ or something similar) or at least put a good amount of test data into the databases or even DOS the app, -__DAST tools should never run against a production environment!!!__ -All tools have the possibility for authentication into the application and this could lead to production credentials compromise. Also when run authenticated scans against the testing environment, use suitable roles (if RBAC model exists, for the application, of course), e.g. DAST shouldn’t use role that have the possibility to delete or modify other users because this way the whole environment can became unusable. -As with other testing methodologies it is necessary to analyze the scope, so not unneeded targets are scanned. +Different tools can help us do this. -## Usage -Common error is scanning compensating security controls (e.g. WAF) instead of the real application. DAST is in its core an application security testing tool and should be used against actual applications, not against security mitigations. As it uses pretty standardized attacks, external controls can block the attacking traffic and this way to cover potentially exploitable flows (as per definition adversary would be able to eventually bypass such measures) -Actual scans are quite slow, so sometimes they should be run outside of the DevOps pipeline. Good example is running them nightly or during the weekend. Some of the simple tools (zap / arachny, …) could be used into pipelines but often, due to the nature of the scan can slow down the whole development process. -Once the DAST testing is complete, the results are analyzed to identify any vulnerabilities that were discovered. The organization can then take appropriate remediation steps to address the vulnerabilities and improve the overall security of the application. 
This may involve fixing the underlying code, implementing additional security controls, such as input validation and filtering, or both.
-In conclusion, the use of DAST in the SSDLC is essential for ensuring the security of an application. By conducting DAST testing and identifying vulnerabilities early in the development process, organizations can prevent security incidents and protect their assets from potential threats. Open-source tools, such as ZAP, Burp Suite, and Arachni, can be used to conduct DAST testing and help organizations improve their overall security posture.
-As with all other tools part of DevSecOps pipeline DAST should not be the only scanner in place and as with all others, it is not a substitute for penetration test and good development practices.
+Such tools are [Burp Intruder](https://portswigger.net/burp/documentation/desktop/tools/intruder) and [SmartBear](https://smartbear.com/).
+However, these are proprietary tools that require a paid license to use.
-## Some useful links and open-source tools:
-- https://github.com/zaproxy/zaproxy
-- https://www.arachni-scanner.com/
-- https://owasp.org/www-project-devsecops-guideline/latest/02b-Dynamic-Application-Security-Testing
+That is why for our demonstration today we are going to use a simple open-source CLI written in Go that was inspired by Burp Intruder and provides similar functionality.
+It is called [httpfuzz](https://github.com/JonCooperWorks/httpfuzz).
+
+
+## Getting started
+
+This tool is quite simple.
+We provide it with a request template (in which we have defined placeholders for the fuzzed data) and a wordlist (the fuzzed data), and `httpfuzz` will render the requests and send them to our server.
+
+First, we need to define a template for our requests.
+Create a file named `request.txt` with the following content:
+
+```text
+POST / HTTP/1.1
+Content-Type: application/json
+User-Agent: PostmanRuntime/7.26.3
+Accept: */*
+Cache-Control: no-cache
+Host: localhost:8000
+Accept-Encoding: gzip, deflate
+Connection: close
+Content-Length: 35
+
+{
+    "name": "`S9`",
+}
+```
+
+This is a valid HTTP `POST` request to the `/` route with a JSON body.
+The "\`" symbol in the body defines a placeholder that will be substituted with the data we provide.
+
+`httpfuzz` can also fuzz the headers, path, and URL params.
+
+Next, we need to provide a wordlist of inputs that will be placed in the request.
+Create a file named `data.txt` with the following content:
+
+```text
+SOME_NAME
+Mozilla/5.0 (Linux; Android 7.0; SM-G930VC Build/NRD90M; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/58.0.3029.83 Mobile Safari/537.36
+```
+
+In this file, we defined two inputs that will be substituted into the request.
+In a real-world scenario, you should put much more data here for proper fuzz testing.
+
+Now that we have our template and our inputs, let's run the tool.
+Unfortunately, this tool is not distributed as a binary, so we will have to build it from source.
+Clone the repo and run:
+
+```shell
+go build -o httpfuzz cmd/httpfuzz.go
+```
+
+(this requires a recent version of Go to be installed on your machine).
+
+Now that we have the binary, let's run it:
+
+```shell
+./httpfuzz \
+    --wordlist data.txt \
+    --seed-request request.txt \
+    --target-header User-Agent \
+    --target-param fuzz \
+    --delay-ms 50 \
+    --skip-cert-verify \
+    --proxy-url http://localhost:8080
+```
+
+- `httpfuzz` is the binary we are invoking.
+- `--wordlist data.txt` is the file with inputs we provided.
+- `--seed-request request.txt` is the request template.
+- `--target-header User-Agent` tells `httpfuzz` to use the provided inputs in the place of the `User-Agent` header.
+- `--target-param fuzz` tells `httpfuzz` to use the provided inputs as values for the `fuzz` URL parameter.
+- `--delay-ms 50` tells `httpfuzz` to wait 50 ms between the requests.
+- `--skip-cert-verify` tells `httpfuzz` to not do any TLS verification.
+- `--proxy-url http://localhost:8080` tells `httpfuzz` where our HTTP server is.
+
+We have 2 inputs and 3 places to put them (in the body, the `User-Agent` header, and the `fuzz` parameter).
+This means that `httpfuzz` will generate 6 requests and send them to our server.
+
+Let's run it and see what happens.
+I wrote a simple web server that logs all requests so that we can see what is coming into our server:
+
+```shell
+$ ./httpfuzz \
+    --wordlist data.txt \
+    --seed-request request.txt \
+    --target-header User-Agent \
+    --target-param fuzz \
+    --delay-ms 50 \
+    --skip-cert-verify \
+    --proxy-url http://localhost:8080
+
+httpfuzz: httpfuzz.go:164: Sending 6 requests
+```
+
+and the server logs:
+
+```text
+-----
+Got request to http://localhost:8000/
+User-Agent header = [SOME_NAME]
+Name = S9
+-----
+Got request to http://localhost:8000/?fuzz=SOME_NAME
+User-Agent header = [PostmanRuntime/7.26.3]
+Name = S9
+-----
+Got request to http://localhost:8000/
+User-Agent header = [PostmanRuntime/7.26.3]
+Name = SOME_NAME
+-----
+Got request to http://localhost:8000/
+User-Agent header = [Mozilla/5.0 (Linux; Android 7.0; SM-G930VC Build/NRD90M; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/58.0.3029.83 Mobile Safari/537.36]
+Name = S9
+-----
+Got request to http://localhost:8000/?fuzz=Mozilla%2F5.0+%28Linux%3B+Android+7.0%3B+SM-G930VC+Build%2FNRD90M%3B+wv%29+AppleWebKit%2F537.36+%28KHTML%2C+like+Gecko%29+Version%2F4.083+Mobile+Safari%2F537.36
+User-Agent header = [PostmanRuntime/7.26.3]
+Name = S9
+-----
+Got request to http://localhost:8000/
+User-Agent header = [PostmanRuntime/7.26.3]
+Name = Mozilla/5.0 (Linux; Android 7.0; SM-G930VC Build/NRD90M; wv) AppleWebKit/537.36 (KHTML, like Gecko)
Version/4.0 Chrome/58.0.3029.83 Mobile Safari/537.36
+```
+
+We see that we have received 6 HTTP requests.
+
+Two of them have a value from our values file for the `User-Agent` header, and 4 have the default header from the template.
+Two of them have a value from our values file for the `fuzz` query parameter, and 4 have no `fuzz` parameter at all.
+Two of them have a value from our values file for the `Name` body property, and 4 have the default name from the template.
+
+A slight improvement of the tool could be to make different permutations of these requests (for example, a request that has both `?fuzz=` and `User-Agent` as values from the values file).
+
+Notice how `httpfuzz` does not give us any information about the outcome of the requests.
+To figure that out, we need to either set up some sort of monitoring for our server or write a `httpfuzz` plugin that will process the results in a way that is meaningful to us.
+Let's do that.
+
+To write a custom plugin, we need to implement the [`Listener`](https://github.com/JonCooperWorks/httpfuzz/blob/master/plugin.go#L13) interface:
+
+```go
+// Listener must be implemented by a plugin to users to hook the request - response transaction.
+// The Listen method will be run in its own goroutine, so plugins cannot block the rest of the program, however panics can take down the entire process.
+type Listener interface {
+	Listen(results <-chan *Result)
+}
+```
+
+Here is a simple plugin that logs the status code of each response:
+
+```go
+package main
+
+import (
+	"log"
+
+	"github.com/joncooperworks/httpfuzz"
+)
+
+type logResponseCodePlugin struct {
+	logger *log.Logger
+}
+
+func (b *logResponseCodePlugin) Listen(results <-chan *httpfuzz.Result) {
+	for result := range results {
+		b.logger.Printf("Got %d response from the server\n", result.Response.StatusCode)
+	}
+}
+
+// New returns a logResponseCodePlugin plugin that simply logs the response code of each response.
+func New(logger *log.Logger) (httpfuzz.Listener, error) {
+	return &logResponseCodePlugin{logger: logger}, nil
+}
+```
+
+Now we need to build our plugin first:
+
+```shell
+go build -buildmode=plugin -o log exampleplugins/log/log.go
+```
+
+and then we can plug it into `httpfuzz` via the `--post-request` flag:
+
+```shell
+$ ./httpfuzz \
+    --wordlist data.txt \
+    --seed-request request.txt \
+    --target-header User-Agent \
+    --target-param fuzz \
+    --delay-ms 50 \
+    --skip-cert-verify \
+    --proxy-url http://localhost:8080 \
+    --post-request log
+
+httpfuzz: httpfuzz.go:164: Sending 6 requests
+httpfuzz: log.go:15: Got 200 response from the server
+httpfuzz: log.go:15: Got 200 response from the server
+httpfuzz: log.go:15: Got 200 response from the server
+httpfuzz: log.go:15: Got 200 response from the server
+httpfuzz: log.go:15: Got 200 response from the server
+httpfuzz: log.go:15: Got 200 response from the server
+```
+
+Voila!
+Now we can at least see what the response code from the server was.
+
+Of course, we can write much more sophisticated plugins that output much more data, but for the purpose of this exercise, that is enough.
+
+## Summary
+
+Fuzzing is a really powerful testing technique that goes way beyond unit testing.
+
+Fuzzing can be extremely useful for testing HTTP servers by substituting parts of valid HTTP requests with data that could potentially expose vulnerabilities or deficiencies in our server.
+
+There are many tools that can help us in fuzz testing our web applications, both free and paid ones.
+ +## Resources + +[OWASP: Fuzzing](https://owasp.org/www-community/Fuzzing) + +[OWASP: Fuzz Vectors](https://owasp.org/www-project-web-security-testing-guide/v41/6-Appendix/C-Fuzz_Vectors) + +[Hacking HTTP with HTTPfuzz](https://medium.com/swlh/hacking-http-with-httpfuzz-67cfd061b616) + +[Fuzzing the Stack for Fun and Profit at DefCamp 2019](https://www.youtube.com/watch?v=qCMfrbpuCBk&list=PLnwq8gv9MEKiUOgrM7wble1YRsrqRzHKq&index=33) + +[HTTP Fuzzing Scan with SmartBear](https://support.smartbear.com/readyapi/docs/security/scans/types/fuzzing-http.html) + +[Fuzzing Session: Finding Bugs and Vulnerabilities Automatically](https://youtu.be/DSJePjhBN5E) + +[Fuzzing the CNCF Landscape](https://youtu.be/zIyIZxAZLzo) diff --git a/2023/day18.md b/2023/day18.md index e442aba..8a9fc53 100644 --- a/2023/day18.md +++ b/2023/day18.md @@ -1,33 +1,26 @@ -# IAST (Interactive Application Security Testing) +# DAST +DAST, or Dynamic Application Security Testing, is a technique used to evaluate the security of an application by simulating attacks from external sources. +The idea is to automate black-box penetration testing as much as possible. +It can be used to pick off the low-hanging fruit, sparing a real human's time, and additionally to generate traffic for other security tools (e.g. IAST). -IAST is a type of security testing tool that is designed to identify vulnerabilities in web applications and help developers fix them. It works by injecting a small agent into the application's runtime environment and monitoring its behavior in real-time. This allows IAST tools to identify vulnerabilities as they occur, rather than relying on static analysis or simulated attacks. +Nevertheless, it is an essential component of the SSDLC, as it helps organizations uncover potential vulnerabilities early in the development process, before the application is deployed to production.
By conducting DAST testing, organizations can prevent security incidents and protect their data and assets from being compromised by attackers. -IAST works through software instrumentation, or the use of instruments to monitor an application as it runs and gather information about what it does and how it performs. IAST solutions instrument applications by deploying agents and sensors in running applications and continuously analyzing all application interactions initiated by manual tests, automated tests, or a combination of both to identify vulnerabilities in real time Instrumentation. -IAST agent is running inside the application and monitor for known attack patterns. As it is part of the application, it can monitor traffic between different components (either as classic MVC deployments and in microservices deployment). +## Tools + +There are various tools available for conducting DAST, such as OWASP ZAP and Arachni (open-source) and Burp Suite (commercial). These tools can simulate different types of attacks on the application, such as SQL injection, cross-site scripting, and other common vulnerabilities. For example, if an application is vulnerable to SQL injection, a DAST tool can send a malicious SQL query to the application, such as `' OR 1=1 --`, and evaluate its response to determine if it is vulnerable. If the application is vulnerable, it may return all records from the database, indicating that the SQL injection attack was successful.
+As some of the tests can be quite invasive (they may include 'DROP TABLE' statements or something similar), or at least put a good amount of test data into the databases, or even DoS the app, +__DAST tools should never be run against a production environment!!!__ +All tools support authenticating into the application, and this could lead to a compromise of production credentials. Also, when running authenticated scans against the testing environment, use suitable roles (if an RBAC model exists for the application, of course); e.g. DAST shouldn't use a role that can delete or modify other users, because this way the whole environment can become unusable. +As with other testing methodologies, it is necessary to analyze the scope so that no unneeded targets are scanned. -## Advantages -One of the main advantages of IAST tools is that they can provide detailed and accurate information about vulnerabilities and how to fix them. This can save developers a lot of time and effort, as they don't have to manually search for vulnerabilities or try to reproduce them in a testing environment. IAST tools can also identify vulnerabilities that might be missed by other testing methods, such as those that require user interaction or are triggered under certain conditions. Testing time depends on the tests used (as IAST is not a standalone system) and with faster tests (automated tests) can be include into CI/CD pipelines. It can be used to detect different kind of vulnerabilities and due to the nature of the tools (it looks for “real traffic only) false positives/negatives findings are relatively rear compared to other testing types. -IAST can be used in two flavors - as a typical testing tool and as real-time protection (it is called RAST in this case). Both work at the same principals and can be used together. +## Usage +A common error is scanning compensating security controls (e.g. a WAF) instead of the real application.
DAST is at its core an application security testing tool and should be used against actual applications, not against security mitigations. As it uses fairly standardized attacks, external controls can block the attacking traffic and thereby mask potentially exploitable flaws (by definition, an adversary would eventually be able to bypass such measures). +Actual scans are quite slow, so sometimes they should be run outside of the DevOps pipeline; a good example is running them nightly or during the weekend. Some of the simpler tools (ZAP, Arachni, …) can be used in pipelines but often, due to the nature of the scan, they can slow down the whole development process. +Once the DAST testing is complete, the results are analyzed to identify any vulnerabilities that were discovered. The organization can then take appropriate remediation steps to address the vulnerabilities and improve the overall security of the application. This may involve fixing the underlying code, implementing additional security controls, such as input validation and filtering, or both. +In conclusion, the use of DAST in the SSDLC is essential for ensuring the security of an application. By conducting DAST testing and identifying vulnerabilities early in the development process, organizations can prevent security incidents and protect their assets from potential threats. Tools such as ZAP, Burp Suite, and Arachni can be used to conduct DAST testing and help organizations improve their overall security posture. +As with all other tools in the DevSecOps pipeline, DAST should not be the only scanner in place, and like all of them, it is not a substitute for penetration testing and good development practices. -## There are several disadvantages of the technology as well: -- It is relatively new technology so there is not a lot of knowledge and experience both for the security teams and for the tools builders (open-source or commercial).
-- The solution cannot be used alone - something (or someone) should generate traffic patterns. It is important that all possible endpoints are queried during the tests. -- Findings are based on traffic. This is especially true if used for testing alone - if there is no traffic to a portion of the app / site it would not be tested so no findings are going to be generated. -- Due to need of instrumentation of the app, it can be fairly complex, especially compared to the source scanning tools (SAST or SCA). - -There are several different IAST tools available, each with its own features and capabilities. -## Some common features of IAST tools include: -- Real-time monitoring: IAST tools monitor the application's behavior in real-time, allowing them to identify vulnerabilities as they occur. -- Vulnerability identification: IAST tools can identify a wide range of vulnerabilities, including injection attacks, cross-site scripting (XSS), and cross-site request forgery (CSRF). -- Remediation guidance: IAST tools often provide detailed information about how to fix identified vulnerabilities, including code snippets and recommendations for secure coding practices. -- Integration with other tools: IAST tools can often be integrated with other security testing tools, such as static code analysis or penetration testing tools, to provide a more comprehensive view of an application's security. - -IAST tools can be a valuable addition to a developer's toolkit, as they can help identify and fix vulnerabilities in real-time, saving time and effort. If you are a developer and are interested in using an IAST tool, there are many options available, so it is important to research and compare different tools to find the one that best fits your needs. - -## Tool example -There are almost no open-source tools on the market. Example is the commercial tool: Contrast Community Edition (CE) - Fully featured version for 1 app and up to 5 users (some Enterprise features disabled). 
Contrast CE supports Java and .NET only. -Can be found here - https://www.contrastsecurity.com/contrast-community-edition +## Some useful links and open-source tools: +- https://github.com/zaproxy/zaproxy +- https://www.arachni-scanner.com/ +- https://owasp.org/www-project-devsecops-guideline/latest/02b-Dynamic-Application-Security-Testing diff --git a/2023/day19.md b/2023/day19.md index e69de29..b13e84c 100644 --- a/2023/day19.md +++ b/2023/day19.md @@ -0,0 +1,33 @@ +# IAST (Interactive Application Security Testing) + +IAST is a type of security testing tool that is designed to identify vulnerabilities in web applications and help developers fix them. It works by injecting a small agent into the application's runtime environment and monitoring its behavior in real-time. This allows IAST tools to identify vulnerabilities as they occur, rather than relying on static analysis or simulated attacks. + +IAST works through software instrumentation: the use of instruments to monitor an application as it runs and gather information about what it does and how it performs. IAST solutions instrument applications by deploying agents and sensors in running applications and continuously analyzing all application interactions initiated by manual tests, automated tests, or a combination of both to identify vulnerabilities in real time. +The IAST agent runs inside the application and monitors for known attack patterns. As it is part of the application, it can monitor traffic between different components (both in classic MVC deployments and in microservices deployments). + +## For IAST to be used, there are a few prerequisites. +- The application should be instrumented (the agent injected). +- Traffic should be generated - via manual or automated tests. Another possible approach is via DAST tools (OWASP ZAP can be used, for example).
+ +## Advantages +One of the main advantages of IAST tools is that they can provide detailed and accurate information about vulnerabilities and how to fix them. This can save developers a lot of time and effort, as they don't have to manually search for vulnerabilities or try to reproduce them in a testing environment. IAST tools can also identify vulnerabilities that might be missed by other testing methods, such as those that require user interaction or are triggered under certain conditions. Testing time depends on the tests used (as IAST is not a standalone system), and with faster (automated) tests it can be included in CI/CD pipelines. It can be used to detect different kinds of vulnerabilities, and due to the nature of the tools (they look at “real” traffic only), false positive/negative findings are relatively rare compared to other testing types. +IAST can be used in two flavors - as a typical testing tool and as real-time protection (called RASP, Runtime Application Self-Protection, in this case). Both work on the same principles and can be used together. + +## There are several disadvantages of the technology as well: +- It is a relatively new technology, so there is not a lot of knowledge and experience, both among security teams and among tool builders (open-source or commercial). +- The solution cannot be used alone - something (or someone) should generate traffic patterns. It is important that all possible endpoints are queried during the tests. +- Findings are based on traffic. This is especially true if used for testing alone - if there is no traffic to a portion of the app/site, it will not be tested, so no findings will be generated. +- Due to the need to instrument the app, it can be fairly complex, especially compared to source scanning tools (SAST or SCA). + +There are several different IAST tools available, each with its own features and capabilities.
+## Some common features of IAST tools include: +- Real-time monitoring: IAST tools monitor the application's behavior in real-time, allowing them to identify vulnerabilities as they occur. +- Vulnerability identification: IAST tools can identify a wide range of vulnerabilities, including injection attacks, cross-site scripting (XSS), and cross-site request forgery (CSRF). +- Remediation guidance: IAST tools often provide detailed information about how to fix identified vulnerabilities, including code snippets and recommendations for secure coding practices. +- Integration with other tools: IAST tools can often be integrated with other security testing tools, such as static code analysis or penetration testing tools, to provide a more comprehensive view of an application's security. + +IAST tools can be a valuable addition to a developer's toolkit, as they can help identify and fix vulnerabilities in real-time, saving time and effort. If you are a developer interested in using an IAST tool, there are many options available, so it is important to research and compare different tools to find the one that best fits your needs. + +## Tool example +There are almost no open-source tools on the market. An example is the commercial tool Contrast Community Edition (CE) - a fully featured version for 1 app and up to 5 users (some Enterprise features disabled). Contrast CE supports Java and .NET only. +It can be found here - https://www.contrastsecurity.com/contrast-community-edition diff --git a/2023/day20.md b/2023/day20.md index e69de29..3e37c7f 100644 --- a/2023/day20.md +++ b/2023/day20.md @@ -0,0 +1,10 @@ +# IAST and DAST in conjunction - lab time + +1. As there are no open-source IAST implementations, we will use a commercial one with some free licenses. For this purpose, you will need 2 components: +an IAST solution from here - https://github.com/rstatsinger/contrast-java-webgoat-docker .
You need Docker and docker-compose installed in a Mac or Linux environment (this lab is tested on Mint). Please follow the README to create an account in Contrast. +2. For running the IAST there are a few ways to do it - manually, via a DAST scanner, ... +- The easiest way is to use the ZAP proxy. For this purpose, install ZAP from here - https://www.zaproxy.org/download/ +- Install zap-cli - https://github.com/Grunny/zap-cli +- Run the ZAP proxy (from the installed location; in Mint it is by default in /opt/zaproxy) +- Set the env variables ZAP_API_KEY and ZAP_PORT +- Run several commands with zap-cli. For example: zap-cli quick-scan -s all --ajax-spider -r http://127.0.0.1:8080/WebGoat/login.mvc . You should see some results in the Contrast UI. diff --git a/2023/images/day06-1.png b/2023/images/day06-1.png new file mode 100644 index 0000000..1df9956 Binary files /dev/null and b/2023/images/day06-1.png differ diff --git a/2023/images/day06-2.png b/2023/images/day06-2.png new file mode 100644 index 0000000..e1ff7fc Binary files /dev/null and b/2023/images/day06-2.png differ diff --git a/2023/images/day06-3.png b/2023/images/day06-3.png new file mode 100644 index 0000000..d7bed0c Binary files /dev/null and b/2023/images/day06-3.png differ diff --git a/2023/images/day06-4.png b/2023/images/day06-4.png new file mode 100644 index 0000000..394f4db Binary files /dev/null and b/2023/images/day06-4.png differ diff --git a/2023/images/day06-5.png b/2023/images/day06-5.png new file mode 100644 index 0000000..38d4780 Binary files /dev/null and b/2023/images/day06-5.png differ diff --git a/2023/images/day09-1.png b/2023/images/day09-1.png new file mode 100644 index 0000000..bc16722 Binary files /dev/null and b/2023/images/day09-1.png differ diff --git a/2023/images/day09-10.png b/2023/images/day09-10.png new file mode 100644 index 0000000..081fe90 Binary files /dev/null and b/2023/images/day09-10.png differ diff --git a/2023/images/day09-11.png b/2023/images/day09-11.png new file mode 100644 index
0000000..28832b0 Binary files /dev/null and b/2023/images/day09-11.png differ diff --git a/2023/images/day09-12.png b/2023/images/day09-12.png new file mode 100644 index 0000000..88f67c6 Binary files /dev/null and b/2023/images/day09-12.png differ diff --git a/2023/images/day09-13.png b/2023/images/day09-13.png new file mode 100644 index 0000000..680dca8 Binary files /dev/null and b/2023/images/day09-13.png differ diff --git a/2023/images/day09-14.png b/2023/images/day09-14.png new file mode 100644 index 0000000..673e445 Binary files /dev/null and b/2023/images/day09-14.png differ diff --git a/2023/images/day09-15.png b/2023/images/day09-15.png new file mode 100644 index 0000000..3b1f54f Binary files /dev/null and b/2023/images/day09-15.png differ diff --git a/2023/images/day09-16.png b/2023/images/day09-16.png new file mode 100644 index 0000000..851dc4f Binary files /dev/null and b/2023/images/day09-16.png differ diff --git a/2023/images/day09-17.png b/2023/images/day09-17.png new file mode 100644 index 0000000..4001a02 Binary files /dev/null and b/2023/images/day09-17.png differ diff --git a/2023/images/day09-18.png b/2023/images/day09-18.png new file mode 100644 index 0000000..e172182 Binary files /dev/null and b/2023/images/day09-18.png differ diff --git a/2023/images/day09-19.png b/2023/images/day09-19.png new file mode 100644 index 0000000..e36f2d1 Binary files /dev/null and b/2023/images/day09-19.png differ diff --git a/2023/images/day09-2.png b/2023/images/day09-2.png new file mode 100644 index 0000000..26592f0 Binary files /dev/null and b/2023/images/day09-2.png differ diff --git a/2023/images/day09-3.png b/2023/images/day09-3.png new file mode 100644 index 0000000..4a01006 Binary files /dev/null and b/2023/images/day09-3.png differ diff --git a/2023/images/day09-4.png b/2023/images/day09-4.png new file mode 100644 index 0000000..4cbac55 Binary files /dev/null and b/2023/images/day09-4.png differ diff --git a/2023/images/day09-5.png 
b/2023/images/day09-5.png new file mode 100644 index 0000000..f45b0af Binary files /dev/null and b/2023/images/day09-5.png differ diff --git a/2023/images/day09-6.png b/2023/images/day09-6.png new file mode 100644 index 0000000..fee08e7 Binary files /dev/null and b/2023/images/day09-6.png differ diff --git a/2023/images/day09-7.png b/2023/images/day09-7.png new file mode 100644 index 0000000..5d74a51 Binary files /dev/null and b/2023/images/day09-7.png differ diff --git a/2023/images/day09-8.png b/2023/images/day09-8.png new file mode 100644 index 0000000..0e6591b Binary files /dev/null and b/2023/images/day09-8.png differ diff --git a/2023/images/day11-1.png b/2023/images/day11-1.png new file mode 100644 index 0000000..05f299b Binary files /dev/null and b/2023/images/day11-1.png differ diff --git a/2023/images/day13-1.png b/2023/images/day13-1.png new file mode 100644 index 0000000..2faf17d Binary files /dev/null and b/2023/images/day13-1.png differ diff --git a/Contributors.md b/Contributors.md index 25226ba..2d8cdf0 100644 --- a/Contributors.md +++ b/Contributors.md @@ -517,4 +517,4 @@ Contributors - + \ No newline at end of file diff --git a/README.md b/README.md index ae2993f..c9e4bc4 100644 --- a/README.md +++ b/README.md @@ -32,6 +32,8 @@ The two images below will take you to the 2022 and 2023 edition of the learning

+From this year, we have built a website for the 90DaysOfDevOps challenge :rocket: :technologist: - [Link to the website](https://www.90daysofdevops.com/#/2023) + + The quickest way to get in touch is going to be via Twitter, my handle is [@MichaelCade1](https://twitter.com/MichaelCade1) diff --git a/index.html b/index.html index 9701dbf..9e79507 100644 --- a/index.html +++ b/index.html @@ -3,9 +3,10 @@ - Document + 90DaysOfDevOps - + diff --git a/template_repository/day01.md b/template_repository/day01.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day02.md b/template_repository/day02.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day03.md b/template_repository/day03.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day04.md b/template_repository/day04.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day05.md b/template_repository/day05.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day06.md b/template_repository/day06.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day07.md b/template_repository/day07.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day08.md b/template_repository/day08.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day09.md b/template_repository/day09.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day10.md b/template_repository/day10.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day11.md b/template_repository/day11.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day12.md b/template_repository/day12.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day13.md b/template_repository/day13.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day14.md
b/template_repository/day14.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day15.md b/template_repository/day15.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day16.md b/template_repository/day16.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day17.md b/template_repository/day17.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day18.md b/template_repository/day18.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day19.md b/template_repository/day19.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day20.md b/template_repository/day20.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day21.md b/template_repository/day21.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day22.md b/template_repository/day22.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day23.md b/template_repository/day23.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day24.md b/template_repository/day24.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day25.md b/template_repository/day25.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day26.md b/template_repository/day26.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day27.md b/template_repository/day27.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day28.md b/template_repository/day28.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day29.md b/template_repository/day29.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day30.md b/template_repository/day30.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day31.md 
b/template_repository/day31.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day32.md b/template_repository/day32.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day33.md b/template_repository/day33.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day34.md b/template_repository/day34.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day35.md b/template_repository/day35.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day36.md b/template_repository/day36.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day37.md b/template_repository/day37.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day38.md b/template_repository/day38.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day39.md b/template_repository/day39.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day40.md b/template_repository/day40.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day41.md b/template_repository/day41.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day42.md b/template_repository/day42.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day43.md b/template_repository/day43.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day44.md b/template_repository/day44.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day45.md b/template_repository/day45.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day46.md b/template_repository/day46.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day47.md b/template_repository/day47.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day48.md 
b/template_repository/day48.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day49.md b/template_repository/day49.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day50.md b/template_repository/day50.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day51.md b/template_repository/day51.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day52.md b/template_repository/day52.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day53.md b/template_repository/day53.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day54.md b/template_repository/day54.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day55.md b/template_repository/day55.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day56.md b/template_repository/day56.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day57.md b/template_repository/day57.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day58.md b/template_repository/day58.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day59.md b/template_repository/day59.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day60.md b/template_repository/day60.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day61.md b/template_repository/day61.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day62.md b/template_repository/day62.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day63.md b/template_repository/day63.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day64.md b/template_repository/day64.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day65.md 
b/template_repository/day65.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day66.md b/template_repository/day66.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day67.md b/template_repository/day67.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day68.md b/template_repository/day68.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day69.md b/template_repository/day69.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day70.md b/template_repository/day70.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day71.md b/template_repository/day71.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day72.md b/template_repository/day72.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day73.md b/template_repository/day73.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day74.md b/template_repository/day74.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day75.md b/template_repository/day75.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day76.md b/template_repository/day76.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day77.md b/template_repository/day77.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day78.md b/template_repository/day78.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day79.md b/template_repository/day79.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day80.md b/template_repository/day80.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day81.md b/template_repository/day81.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day82.md 
b/template_repository/day82.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day83.md b/template_repository/day83.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day84.md b/template_repository/day84.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day85.md b/template_repository/day85.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day86.md b/template_repository/day86.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day87.md b/template_repository/day87.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day88.md b/template_repository/day88.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day89.md b/template_repository/day89.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/day90.md b/template_repository/day90.md new file mode 100644 index 0000000..e69de29 diff --git a/template_repository/readme.md b/template_repository/readme.md new file mode 100644 index 0000000..2968176 --- /dev/null +++ b/template_repository/readme.md @@ -0,0 +1,132 @@ +# 90DaysOfDevOps +## Progress + +- [] ♾️ 1 > [](day01.md) + +### DevSecOps + +- [] ♾️ 2 > [](day02.md) +- [] ♾️ 3 > [](day03.md) +- [] ♾️ 4 > [](day04.md) +- [] ♾️ 5 > [](day05.md) +- [] ♾️ 6 > [](day06.md) + +### Secure Coding + +- [] ⌨️ 7 > [](day07.md) +- [] ⌨️ 8 > [](day08.md) +- [] ⌨️ 9 > [](day09.md) +- [] ⌨️ 10 > [](day10.md) +- [] ⌨️ 11 > [](day11.md) +- [] ⌨️ 12 > [](day12.md) +- [] ⌨️ 13 > [](day13.md) + +### Continuous Build, Integration, Testing + +- [] 🐧 14 > [](day14.md) +- [] 🐧 15 > [](day15.md) +- [] 🐧 16 > [](day16.md) +- [] 🐧 17 > [](day17.md) +- [] 🐧 18 > [](day18.md) +- [] 🐧 19 > [](day19.md) +- [] 🐧 20 > [](day20.md) + +### Continuous Delivery & Deployment + +- [] 🌐 21 > [](day21.md) +- [] 🌐 22 > [](day22.md) +- [] 🌐 23 > [](day23.md) +- [] 🌐 24 > [](day24.md) +- [] 🌐 
25 > [](day25.md) +- [] 🌐 26 > [](day26.md) +- [] 🌐 27 > [](day27.md) + +### Runtime Defence & Monitoring + +- [] ☁️ 28 > [](day28.md) +- [] ☁️ 29 > [](day29.md) +- [] ☁️ 30 > [](day30.md) +- [] ☁️ 31 > [](day31.md) +- [] ☁️ 32 > [](day32.md) +- [] ☁️ 33 > [](day33.md) +- [] ☁️ 34 > [](day34.md) + +### Secrets Management + +- [] 📚 35 > [](day35.md) +- [] 📚 36 > [](day36.md) +- [] 📚 37 > [](day37.md) +- [] 📚 38 > [](day38.md) +- [] 📚 39 > [](day39.md) +- [] 📚 40 > [](day40.md) +- [] 📚 41 > [](day41.md) + +### Python + +- [] 🏗️ 42 > [](day42.md) +- [] 🏗️ 43 > [](day43.md) +- [] 🏗️ 44 > [](day44.md) +- [] 🏗️ 45 > [](day45.md) +- [] 🏗️ 46 > [](day46.md) +- [] 🏗️ 47 > [](day47.md) +- [] 🏗️ 48 > [](day48.md) + +### AWS + +- [] ☸ 49 > [](day49.md) +- [] ☸ 50 > [](day50.md) +- [] ☸ 51 > [](day51.md) +- [] ☸ 52 > [](day52.md) +- [] ☸ 53 > [](day53.md) +- [] ☸ 54 > [](day54.md) +- [] ☸ 55 > [](day55.md) + +### OpenShift + +- [] 🤖 56 > [](day56.md) +- [] 🤖 57 > [](day57.md) +- [] 🤖 58 > [](day58.md) +- [] 🤖 59 > [](day59.md) +- [] 🤖 60 > [](day60.md) +- [] 🤖 61 > [](day61.md) +- [] 🤖 62 > [](day62.md) + +### Databases + +- [] 📜 63 > [](day63.md) +- [] 📜 64 > [](day64.md) +- [] 📜 65 > [](day65.md) +- [] 📜 66 > [](day66.md) +- [] 📜 67 > [](day67.md) +- [] 📜 68 > [](day68.md) +- [] 📜 69 > [](day69.md) + +### Serverless + +- [] 🔄 70 > [](day70.md) +- [] 🔄 71 > [](day71.md) +- [] 🔄 72 > [](day72.md) +- [] 🔄 73 > [](day73.md) +- [] 🔄 74 > [](day74.md) +- [] 🔄 75 > [](day75.md) +- [] 🔄 76 > [](day76.md) + +### Service Mesh + +- [] 📈 77 > [](day77.md) +- [] 📈 78 > [](day78.md) +- [] 📈 79 > [](day79.md) +- [] 📈 80 > [](day80.md) +- [] 📈 81 > [](day81.md) +- [] 📈 82 > [](day82.md) +- [] 📈 83 > [](day83.md) + +### Engineering for Day 2 Ops + +- [] 🗃️ 84 > [](day84.md) +- [] 🗃️ 85 > [](day85.md) +- [] 🗃️ 86 > [](day86.md) +- [] 🗃️ 87 > [](day87.md) +- [] 🗃️ 88 > [](day88.md) +- [] 🗃️ 89 > [](day89.md) +- [] 🗃️ 90 > [](day90.md)