🧪 Make pytest notify us about future warnings

In essence, this configures Python to turn any warnings emitted at
runtime into errors[[1]]. This is a best practice that lets us react
to deprecation announcements coming from our dependencies (direct or
transitive, or even from CPython itself)[[2]].

The typical workflow looks like this:

  1. If a dependency is updated and a warning is hit in tests, the
     deprecated API use should be replaced with its newer counterpart.

  2. If the dependency is transitive, or we otherwise have no control
     over it, the specific warning category, a regex matching its
     message, and the module reference (where possible) can be added
     to the list of temporary ignores in `pytest.ini`.

  3. The list of temporary ignores should be re-evaluated periodically,
     including whenever the dependencies are re-pinned in the lockfile.

[1]: https://docs.python.org/3/using/cmdline.html#cmdoption-W
[2]: https://pytest-with-eric.com/configuration/pytest-ignore-warnings/
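The error-then-selectively-ignore workflow above can be sketched with the stdlib `warnings` module directly (the warning message and `old_api` name below are hypothetical illustrations, not from this change):

```python
import warnings

# Mirror pytest's `filterwarnings = error` setting: escalate every
# warning into an exception so deprecations fail loudly in tests.
warnings.simplefilter("error")

try:
    warnings.warn("old_api() is deprecated", DeprecationWarning)
except DeprecationWarning as exc:
    caught = str(exc)

# Step 2's temporary, targeted ignore: a category plus a regex matching
# the message. Newly added filters take precedence over earlier ones.
warnings.filterwarnings(
    "ignore",
    message=r"old_api\(\) is deprecated",
    category=DeprecationWarning,
)
warnings.warn("old_api() is deprecated", DeprecationWarning)  # now silent

print(caught)
```

In `pytest.ini`, the same pair is expressed as an `error` entry followed by narrower `ignore:<message regex>:<category>:<module>` entries for the temporary exceptions.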
Sviatoslav Sydorenko
2024-11-07 13:19:08 +01:00
committed by Alan Rominger
parent 4bbcb34ae3
commit d8e87da898
6 changed files with 117 additions and 0 deletions


@@ -8,6 +8,12 @@ from awx.main.models import Instance
from django.test.utils import override_settings
@pytest.mark.filterwarnings(
# FIXME: Figure out where it is emitted and what causes it.
# FIXME: The suppression should be made more specific or the cause fixed.
# Ref: https://github.com/ansible/awx/pull/15620
"ignore::RuntimeWarning",
)
@pytest.mark.django_db
def test_peers_adding_and_removing(run_module, admin_user):
with override_settings(IS_K8S=True):


@@ -7,6 +7,12 @@ import pytest
from awx.main.models import InstanceGroup, Instance
@pytest.mark.filterwarnings(
# FIXME: Figure out where it is emitted and what causes it.
# FIXME: The suppression should be made more specific or the cause fixed.
# Ref: https://github.com/ansible/awx/pull/15620
"ignore::RuntimeWarning",
)
@pytest.mark.django_db
def test_instance_group_create(run_module, admin_user):
result = run_module(
@@ -36,6 +42,12 @@ def test_instance_group_create(run_module, admin_user):
assert len(all_instance_names) == 1, 'Too many instances in group {0}'.format(','.join(all_instance_names))
@pytest.mark.filterwarnings(
# FIXME: Figure out where it is emitted and what causes it.
# FIXME: The suppression should be made more specific or the cause fixed.
# Ref: https://github.com/ansible/awx/pull/15620
"ignore::RuntimeWarning",
)
@pytest.mark.django_db
def test_container_group_create(run_module, admin_user, kube_credential):
pod_spec = "{ 'Nothing': True }"


@@ -8,6 +8,14 @@ import pytest
from awx.main.models import WorkflowJobTemplateNode, WorkflowJobTemplate, JobTemplate, UnifiedJobTemplate
pytestmark = pytest.mark.filterwarnings(
# FIXME: Figure out where it is emitted and what causes it.
# FIXME: The suppression should be made more specific or the cause fixed.
# Ref: https://github.com/ansible/awx/pull/15620
"ignore::RuntimeWarning",
)
@pytest.fixture
def job_template(project, inventory):
return JobTemplate.objects.create(