Merge branch 'main' into update-sql-injection-algo-to-tokenizer
Wout Feys committed Nov 26, 2024
2 parents 00fbb56 + d3e2c7d commit cbf04e3
Showing 176 changed files with 3,779 additions and 1,604 deletions.
2 changes: 1 addition & 1 deletion .github/SECURITY.md
@@ -1,6 +1,6 @@
# Reporting Security Issues

The Aikido team and community take security bugs in firewall seriously. We appreciate your efforts to responsibly disclose your findings, and will make every effort to acknowledge your contributions.
The Aikido team and community take security bugs in Zen seriously. We appreciate your efforts to responsibly disclose your findings, and will make every effort to acknowledge your contributions.

To report a security issue, register on Intigriti and navigate to https://app.intigriti.com/researcher/programs/aikido/aikidoruntime.

38 changes: 36 additions & 2 deletions .github/workflows/benchmark.yml
@@ -3,7 +3,7 @@ on:
push: {}
workflow_call: {}
jobs:
benchmark:
benchmark_sql_algorithm:
runs-on: ubuntu-latest
timeout-minutes: 10
steps:
@@ -30,12 +30,46 @@ jobs:
run: |
poetry run python ./benchmarks/sql_benchmark/sql_benchmark_fw.py
poetry run python ./benchmarks/sql_benchmark/sql_benchmark_no_fw.py
- name: Run Docker Compose
benchmark_with_flask_mysql:
runs-on: ubuntu-latest
timeout-minutes: 10
steps:
- name: Checkout code
uses: actions/checkout@v2
- name: Start databases
working-directory: ./sample-apps/databases
run: docker compose up --build -d
- name: Start flask-mysql
working-directory: ./sample-apps/flask-mysql
run: |
cp .env.benchmark .env.example
docker compose -f docker-compose.yml -f docker-compose.benchmark.yml up --build -d
- name: Install K6
uses: grafana/setup-k6-action@v1
- name: Run flask-mysql k6 Benchmark
run: |
k6 run -q ./benchmarks/flask-mysql-benchmarks.js
benchmark_with_starlette_app:
runs-on: ubuntu-latest
timeout-minutes: 10
steps:
- name: Checkout code
uses: actions/checkout@v2
- name: Start databases
working-directory: ./sample-apps/databases
run: docker compose up --build -d
- name: Start starlette multi-threaded
working-directory: ./sample-apps/starlette-postgres-uvicorn
run: |
cp .env.benchmark .env.example
docker compose -f docker-compose.yml -f docker-compose.benchmark.yml up --build -d
- name: Install wrk
run: |
sudo apt-get update
sudo apt-get install -y wrk
- name: Set up Python 3.9
uses: actions/setup-python@v2
with:
python-version: 3.9
- name: Run benchmark
run: python ./benchmarks/starlette_benchmark.py
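These workflow jobs drive load generators (k6 and wrk) against the sample apps; the benchmark scripts themselves are not part of this diff. As a rough, self-contained sketch of the kind of request-latency measurement such a script performs — the handler, request count, and use of the standard library here are illustrative assumptions, not the repository's actual `starlette_benchmark.py`:

```python
import http.server
import threading
import time
import urllib.request

# Serve a trivial response on an ephemeral port in a background thread,
# standing in for the sample app the real workflow benchmarks.
class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", "2")
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # keep benchmark output clean
        pass

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"

# Time a small batch of sequential requests and report the mean latency.
n = 50
start = time.perf_counter()
for _ in range(n):
    with urllib.request.urlopen(url) as resp:
        resp.read()
elapsed = time.perf_counter() - start
mean_ms = elapsed / n * 1000
print(f"{n} requests, mean latency {mean_ms:.2f} ms")
server.shutdown()
```

The real jobs measure overhead by running the same load twice, with and without Zen enabled, and comparing the results.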
87 changes: 75 additions & 12 deletions .github/workflows/end2end.yml
@@ -8,58 +8,121 @@ jobs:
steps:
- name: Checkout code
uses: actions/checkout@v2

- name: Start databases
working-directory: ./sample-apps/databases
run: docker compose up --build -d
- name: Build and start aikido mock server
working-directory: ./end2end/server
run: docker build -t mock-core . && docker run --name mock_core -d -p 5000:5000 mock-core

- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
make install
# django-mysql
- name: Start django-mysql
working-directory: ./sample-apps/django-mysql
run: |
docker compose -f docker-compose.yml -f docker-compose.benchmark.yml up --build -d
- name: Run end2end tests for django-mysql
run: sleep 5 && poetry run pytest ./end2end/django_mysql_test.py

# django-mysql-gunicorn
- name: Restart mock server
run: docker restart mock_core
- name: Start django-mysql-gunicorn
working-directory: ./sample-apps/django-mysql-gunicorn
run: |
docker compose -f docker-compose.yml -f docker-compose.benchmark.yml up --build -d
- name: Run end2end tests for django-mysql-gunicorn
run: sleep 5 && poetry run pytest ./end2end/django_mysql_gunicorn_test.py

# django-postgres-gunicorn
- name: Restart mock server
run: docker restart mock_core
- name: Start django-postgres-gunicorn
working-directory: ./sample-apps/django-postgres-gunicorn
run: |
docker compose -f docker-compose.yml -f docker-compose.benchmark.yml up --build -d
- name: Run end2end tests for django-postgres-gunicorn
run: sleep 5 && poetry run pytest ./end2end/django_postgres_gunicorn_test.py

# flask-mongo
- name: Restart mock server
run: docker restart mock_core
- name: Start flask-mongo
working-directory: ./sample-apps/flask-mongo
run: |
docker compose -f docker-compose.yml -f docker-compose.benchmark.yml up --build -d
- name: Run end2end tests for flask-mongo
run: sleep 5 && poetry run pytest ./end2end/flask_mongo_test.py

# flask-mysql
- name: Restart mock server
run: docker restart mock_core
- name: Start flask-mysql
working-directory: ./sample-apps/flask-mysql
run: |
docker compose -f docker-compose.yml -f docker-compose.benchmark.yml up --build -d
- name: Run end2end tests for flask-mysql
run: sleep 5 && poetry run pytest ./end2end/flask_mysql_test.py

# flask-mysql-uwsgi
- name: Restart mock server
run: docker restart mock_core
- name: Start flask-mysql-uwsgi
working-directory: ./sample-apps/flask-mysql-uwsgi
run: |
docker compose -f docker-compose.yml -f docker-compose.benchmark.yml up --build -d
- name: Run end2end tests for flask-mysql-uwsgi
run: sleep 5 && poetry run pytest ./end2end/flask_mysql_uwsgi_test.py

# flask-postgres
- name: Restart mock server
run: docker restart mock_core
- name: Start flask-postgres
working-directory: ./sample-apps/flask-postgres
run: |
docker compose -f docker-compose.yml -f docker-compose.benchmark.yml up --build -d
- name: Run end2end tests for flask-postgres
run: sleep 5 && poetry run pytest ./end2end/flask_postgres_test.py

# flask-postgres-xml
- name: Restart mock server
run: docker restart mock_core
- name: Start flask-postgres-xml
working-directory: ./sample-apps/flask-postgres-xml
run: |
docker compose -f docker-compose.yml -f docker-compose.benchmark.yml up --build -d
- name: Run end2end tests for flask-postgres-xml
run: |
sleep 5
poetry run pytest ./end2end/flask_postgres_xml_test.py
docker restart mock_core
poetry run pytest ./end2end/flask_postgres_xml_lxml_test.py
# quart-postgres-uvicorn
- name: Restart mock server
run: docker restart mock_core
- name: Start quart-postgres-uvicorn
working-directory: ./sample-apps/quart-postgres-uvicorn
run: |
docker compose -f docker-compose.yml -f docker-compose.benchmark.yml up --build -d
- name: Run end2end tests for quart-postgres-uvicorn
run: sleep 5 && poetry run pytest ./end2end/quart_postgres_uvicorn_test.py

# starlette-postgres-uvicorn
- name: Restart mock server
run: docker restart mock_core
- name: Start starlette-postgres-uvicorn
working-directory: ./sample-apps/starlette-postgres-uvicorn
run: |
docker compose -f docker-compose.yml -f docker-compose.benchmark.yml up --build -d
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}

- name: Install dependencies
run: |
python -m pip install --upgrade pip
make install
- name: Run end2end tests
run: |
make end2end
- name: Run end2end tests for starlette-postgres-uvicorn
run: sleep 5 && poetry run pytest ./end2end/starlette_postgres_uvicorn_test.py
44 changes: 15 additions & 29 deletions .github/workflows/test-publish.yml
@@ -40,42 +40,28 @@ jobs:
run: |
make test
build:
name: Build distribution 📦
name: Build distribution 📦 and Publish to TestPyPI
needs:
- tests
runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v4
- name: Installation
run: make install
- name: Build distribution packages
run: poetry build
- name: Install poetry
run: pip install poetry
- name: Install dependencies
run: poetry install

- name: Publish to TestPyPI
env:
POETRY_HTTP_BASIC_PYPI_USERNAME: __token__
POETRY_HTTP_BASIC_PYPI_PASSWORD: ${{ secrets.TEST_PYPI_TOKEN }}
run: |
poetry config repositories.test-pypi https://test.pypi.org/legacy/
poetry config pypi-token.test-pypi ${{ secrets.TEST_PYPI_TOKEN }}
poetry publish -r test-pypi --build
- name: Store the distribution packages
uses: actions/upload-artifact@v3
with:
name: python-package-distributions
path: dist/
publish-to-testpypi:
name: Publish Python 🐍 distribution 📦 to TestPyPI
needs:
- build
- tests
runs-on: ubuntu-latest

environment:
name: testpypi
url: https://test.pypi.org/p/aikido_zen

permissions:
id-token: write # IMPORTANT: mandatory for trusted publishing

steps:
- name: Download all the dists
uses: actions/[email protected]
with:
name: python-package-distributions
path: dist/
- name: Publish distribution 📦 to TestPyPI
uses: pypa/gh-action-pypi-publish@release/v1
with:
repository-url: https://test.pypi.org/legacy/
2 changes: 1 addition & 1 deletion README.md
@@ -18,7 +18,7 @@ Zen protects your Python apps by preventing user input containing dangerous strings
Zen will autonomously protect your Python applications from the inside against:

* 🛡️ [NoSQL injection attacks](https://www.aikido.dev/blog/web-application-security-vulnerabilities)
* 🛡️ [SQL injection attacks]([https://www.aikido.dev/blog/web-application-security-vulnerabilities](https://owasp.org/www-community/attacks/SQL_Injection))
* 🛡️ [SQL injection attacks](https://www.aikido.dev/blog/the-state-of-sql-injections)
* 🛡️ [Command injection attacks](https://owasp.org/www-community/attacks/Command_Injection)
* 🛡️ [Path traversal attacks](https://owasp.org/www-community/attacks/Path_Traversal)
* 🛡️ [Server-side request forgery (SSRF)](./docs/ssrf.md)
7 changes: 5 additions & 2 deletions aikido_zen/__init__.py
@@ -15,9 +15,9 @@
# Import background process
from aikido_zen.background_process import start_background_process

# Load environment variables and constants
# Load environment variables and constants
from aikido_zen.config import PKG_VERSION
from aikido_zen.helpers.aikido_disabled_flag_active import aikido_disabled_flag_active

load_dotenv()

@@ -30,6 +30,9 @@ def protect(mode="daemon"):
- daemon_disabled : This will import sinks/sources but won't start a background process
Protect user's application
"""
if aikido_disabled_flag_active():
# Do not run any aikido code when the disabled flag is on
return
if mode in ("daemon", "daemon_only"):
start_background_process()
if mode == "daemon_only":
@@ -67,4 +70,4 @@ def protect(mode="daemon"):
import aikido_zen.sinks.os_system
import aikido_zen.sinks.subprocess

logger.info("Aikido python firewall v%s starting.", PKG_VERSION)
logger.info("Zen by Aikido v%s starting.", PKG_VERSION)
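This hunk adds an early exit to `protect()` via `aikido_disabled_flag_active()`, whose implementation is not shown in this diff. A minimal sketch of such a kill switch — the `AIKIDO_DISABLE` variable name and the accepted values are assumptions for illustration:

```python
import os

def aikido_disabled_flag_active() -> bool:
    """Return True when the user has explicitly disabled the firewall.

    The AIKIDO_DISABLE variable name and truthy spellings are assumptions;
    an unset value or "0" keeps protection enabled.
    """
    return os.environ.get("AIKIDO_DISABLE", "").strip().lower() in ("1", "true")

def protect(mode="daemon"):
    # Mirrors the guard added in this hunk: bail out before importing
    # any sinks/sources or starting the background process.
    if aikido_disabled_flag_active():
        return
    ...  # normal startup (background process, sinks, sources) continues here
```

Checking the flag first means a disabled install pays no import or patching cost at all.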
3 changes: 1 addition & 2 deletions aikido_zen/api_discovery/README.md
@@ -1,3 +1,2 @@
# Feature flag

This feature is currently disabled by default. Enable it by setting the environment variable `AIKIDO_FEATURE_COLLECT_API_SCHEMA` to `true`.
This feature is now on by default.
3 changes: 0 additions & 3 deletions aikido_zen/api_discovery/get_api_info.py
@@ -10,9 +10,6 @@
def get_api_info(context):
"""Generates an apispec based on the context passed along"""
try:
# Check if feature flag COLLECT_API_SCHEMA is enabled
if not is_feature_enabled("COLLECT_API_SCHEMA"):
return {}
body_info = get_body_info(context)
query_info = get_query_info(context)
auth_info = get_auth_types(context)
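The deleted lines removed the feature-flag gate around api discovery. Judging from the `AIKIDO_FEATURE_COLLECT_API_SCHEMA` variable named in the old README text, the removed helper plausibly checked one environment variable per feature — a hedged sketch, not the repository's exact code:

```python
import os

def is_feature_enabled(feature: str) -> bool:
    """Look up AIKIDO_FEATURE_<NAME> in the environment.

    Sketch of the removed gate: api discovery previously ran only when
    AIKIDO_FEATURE_COLLECT_API_SCHEMA was set to "true"; this commit
    makes it always on, so the check was dropped.
    """
    value = os.environ.get(f"AIKIDO_FEATURE_{feature}", "false")
    return value.strip().lower() == "true"
```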
8 changes: 3 additions & 5 deletions aikido_zen/api_discovery/update_route_info.py
@@ -1,31 +1,29 @@
"""Exports update_route_info function"""

from aikido_zen.helpers.logging import logger
from .get_api_info import get_api_info
from .merge_data_schemas import merge_data_schemas
from .merge_auth_types import merge_auth_types

ANALYSIS_ON_FIRST_X_ROUTES = 20


def update_route_info(context, route):
def update_route_info(new_apispec, route):
"""
Checks if a route still needs to be updated (only analyzes first x routes),
and if so, updates route using update_api_info
"""
if route["hits"] <= ANALYSIS_ON_FIRST_X_ROUTES:
# Only analyze the first x routes for api discovery
route["apispec"] = update_api_info(context, route["apispec"])
route["apispec"] = update_api_info(new_apispec, route["apispec"])


def update_api_info(context, existing_apispec=None):
def update_api_info(new_apispec, existing_apispec=None):
"""
Merges two apispec objects into one, getting all properties from both schemas to capture optional properties.
If the body info is not defined, the existing body info is returned (if any).
If there is no existing body info, but the new body info is defined, the new body info is returned without merging.
"""
try:
new_apispec = get_api_info(context)
if not new_apispec:
return existing_apispec
if not existing_apispec:
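After this refactor the apispec is generated once per request and passed into `update_api_info`, which now only merges. The merge semantics described in the docstring — keep properties from both specs, and fall back to whichever side is defined — can be sketched for plain dicts as follows; the key names and merge behavior are simplified stand-ins for `merge_data_schemas` and `merge_auth_types`:

```python
def update_api_info(new_apispec, existing_apispec=None):
    """Merge two apispec dicts, keeping data from both sides.

    Simplified stand-in for the repository's merge helpers: body/query
    schemas are combined key-by-key, auth types are unioned in order.
    """
    if not new_apispec:
        return existing_apispec
    if not existing_apispec:
        return new_apispec
    merged = dict(existing_apispec)
    for key in ("body", "query"):
        old, new = existing_apispec.get(key), new_apispec.get(key)
        merged[key] = {**(old or {}), **(new or {})} if (old or new) else None
    auth = (existing_apispec.get("auth") or []) + (new_apispec.get("auth") or [])
    merged["auth"] = list(dict.fromkeys(auth)) or None  # de-duplicate, keep order
    return merged
```

Merging across requests is what captures optional properties: a field seen in only some requests still ends up in the accumulated spec.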
6 changes: 3 additions & 3 deletions aikido_zen/background_process/aikido_background_process.py
@@ -4,7 +4,6 @@

import multiprocessing.connection as con
import time
import os
import sched
import traceback
import sys
@@ -19,6 +18,7 @@
from aikido_zen.background_process.api.http_api_ratelimited import (
ReportingApiHTTPRatelimited,
)
from aikido_zen.helpers.urls.get_api_url import get_api_url
from .commands import process_incoming_command

EMPTY_QUEUE_INTERVAL = 5 # 5 seconds
@@ -34,7 +34,7 @@ class AikidoBackgroundProcess:
def __init__(self, address, key):
logger.debug("Background process started")
try:
listener = con.Listener(address, authkey=key)
listener = con.Listener(address, authkey=None)
except OSError:
logger.warning(
"Aikido listener may already be running on port %s, exiting", address[1]
@@ -74,7 +74,7 @@ def reporting_thread(self):
self.send_to_connection_manager(event_scheduler)

api = ReportingApiHTTPRatelimited(
"https://guard.aikido.dev/",
reporting_url=get_api_url(),
max_events_per_interval=100,
interval_in_ms=60 * 60 * 1000,
)
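Two changes in this file stand out: the reporting URL is now resolved through `get_api_url()` instead of being hard-coded, and the IPC listener is created with `authkey=None`, which makes `multiprocessing.connection` skip its HMAC handshake when the instrumented app connects. A self-contained sketch of that listener/client round-trip — the payloads are illustrative, not the repository's actual command protocol:

```python
import threading
from multiprocessing.connection import Client, Listener

# Bind an ephemeral localhost port; authkey=None skips the HMAC challenge
# that multiprocessing.connection otherwise performs on accept().
listener = Listener(("127.0.0.1", 0), authkey=None)
address = listener.address

def background_process():
    # Accept one connection and acknowledge the command, loosely
    # mimicking the command loop in AikidoBackgroundProcess.
    conn = listener.accept()
    command = conn.recv()
    conn.send({"status": "ok", "received": command})
    conn.close()

thread = threading.Thread(target=background_process, daemon=True)
thread.start()

# Application side: connect, send a command, read the reply.
with Client(address, authkey=None) as conn:
    conn.send({"action": "ATTACK", "data": "payload"})
    reply = conn.recv()

thread.join(timeout=5)
listener.close()
print(reply["status"])  # -> ok
```

Note the trade-off: dropping the authkey removes a failure mode when app and daemon disagree on the key, but also removes authentication on the local socket.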
@@ -71,8 +71,11 @@ def report_initial_stats(self):
This is run 1m after startup, and checks if we should send out
a preliminary heartbeat with some stats.
"""
should_report_initial_stats = not (
self.statistics.is_empty() or self.conf.received_any_stats
data_is_available = not (
self.statistics.is_empty() and len(self.routes.routes) <= 0
)
should_report_initial_stats = (
data_is_available and not self.conf.received_any_stats
)
if should_report_initial_stats:
self.send_heartbeat()
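The rewritten predicate fixes the initial-heartbeat condition: previously an empty statistics object suppressed the report even when routes had already been collected. Extracted as a pure function, the corrected logic reads:

```python
def should_report_initial_stats(stats_empty, route_count, received_any_stats):
    """Corrected predicate from report_initial_stats.

    Data is available when there are statistics OR at least one route;
    the preliminary heartbeat is sent only if the connection manager
    has not already received stats.
    """
    data_is_available = not (stats_empty and route_count <= 0)
    return data_is_available and not received_any_stats
```

With `stats_empty=True` and three collected routes, the old `not (stats_empty or received_any_stats)` form evaluated to False and suppressed the heartbeat; the fixed version reports it.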