Mirror of https://github.com/ansible/awx.git (synced 2026-02-06 03:54:44 -03:30)

Compare commits

45 Commits
67f1ab2237, 71be8fadcb, c41becec13, 6446b627ad, fcebd188a6, 65771b7629, 86a67abbce, d555093325, 95a099acc5,
d1fc2702ec, 734899228b, 87f729c642, 62fc3994fb, 0d097964be, 9f8b3948e1, 1ce8240192, 1bcfc8f28e, 71925de902,
54057f1c80, ae388d943d, 2d310dc4e5, fe1a767f4f, 8c6581d80a, 33e445f4f6, 9bcb60d9e0, 40109d58c7, 2ef3f5f9e8,
389c4a3180, bee48671cd, 21f551f48a, cbb019ed09, bf5dfdaba7, 0f7f8af9b8, 0237402390, 84d7fa882d, cd2fae3471,
8be64145f9, 23d28fb4c8, aeffd6f393, ab6b4bad03, 769c253ac2, 8031b3d402, df38650aee, 1e57c84383, 2b0846e8a2
2 .github/ISSUE_TEMPLATE.md vendored

@@ -25,7 +25,7 @@ Instead use the bug or feature request.
<!--- Pick one below and delete the rest: -->
- Breaking Change
- New or Enhanced Feature
- Bug or Docs Fix
- Bug, Docs Fix or other nominal change

##### COMPONENT NAME
2 .github/PULL_REQUEST_TEMPLATE.md vendored

@@ -11,7 +11,7 @@ the change does.
<!--- Pick one below and delete the rest: -->
- Breaking Change
- New or Enhanced Feature
- Bug or Docs Fix
- Bug, Docs Fix or other nominal change

##### COMPONENT NAME
<!--- Name of the module/plugin/module/task -->
25 .github/triage_replies.md vendored

@@ -1,5 +1,5 @@
## General
- For the roundup of all the different mailing lists available from AWX, Ansible, and beyond visit: https://docs.ansible.com/ansible/latest/community/communication.html
- Hello, we think your question is answered in our FAQ. Does this: https://www.ansible.com/products/awx-project/faq cover your question?
- You can find the latest documentation here: https://docs.ansible.com/automation-controller/latest/html/userguide/index.html

@@ -58,10 +58,10 @@ Thank you once again for this and your interest in AWX!
## Common

### Give us more info
- Hello, we'd love to help, but we need a little more information about the problem you're having. Screenshots, log outputs, or any reproducers would be very helpful.

### Code of Conduct
- Hello. Please keep in mind that Ansible adheres to a Code of Conduct in its community spaces. The spirit of the code of conduct is to be kind, and this is your friendly reminder to be so. Please see the full code of conduct here if you have questions: https://docs.ansible.com/ansible/latest/community/code_of_conduct.html

### EE Contents / Community General
- Hello. The awx-ee contains the collections and dependencies needed for supported AWX features to function. Anything beyond that (like the community.general package) will require you to build your own EE. For information on how to do that, see https://ansible-builder.readthedocs.io/en/stable/ \

@@ -79,31 +79,34 @@ The Ansible Community is looking at building an EE that corresponds to all of th
- Hello, we think your idea is good! Please consider contributing a PR for this following our contributing guidelines: https://github.com/ansible/awx/blob/devel/CONTRIBUTING.md

### Receptor
- You can find the receptor docs here: https://receptor.readthedocs.io/en/latest/
- Hello, your issue seems related to receptor. Could you please open an issue in the receptor repository? https://github.com/ansible/receptor. Thanks!

### Ansible Engine not AWX
- Hello, your question seems to be about Ansible development, not about AWX. Try asking on the Ansible-devel specific mailing list: https://groups.google.com/g/ansible-devel
- Hello, your question seems to be about using Ansible, not about AWX. https://groups.google.com/g/ansible-project is the best place to visit for user questions about Ansible. Thanks!

### Ansible Galaxy not AWX
- Hey there. That sounds like an FAQ question. Did this: https://www.ansible.com/products/awx-project/faq cover your question?

### Contributing Guidelines
- AWX: https://github.com/ansible/awx/blob/devel/CONTRIBUTING.md
- AWX-Operator: https://github.com/ansible/awx-operator/blob/devel/CONTRIBUTING.md

### Oracle AWX
We'd be happy to help if you can reproduce this with AWX since we do not have Oracle's Linux Automation Manager. If you need help with this specific version of Oracles Linux Automation Manager you will need to contact your Oracle for support.

### AWX Release
Subject: Announcing AWX X.Y.z
Subject: Announcing AWX Xa.Ya.za and AWX-Operator Xb.Yb.zb

- Hi all, \
\
We're happy to announce that the next release of AWX, version <X> is now available! \
In addition AWX Operator version <Y> has also been release! \
We're happy to announce that the next release of AWX, version <b>`Xa.Ya.za`</b> is now available! \
In addition AWX Operator version <b>`Xb.Yb.zb`</b> has also been released! \
\
Please see the releases pages for more details: \
AWX: https://github.com/ansible/awx/releases/tag/<X> \
Operator: https://github.com/ansible/awx-operator/releases/tag/<Y> \
AWX: https://github.com/ansible/awx/releases/tag/Xa.Ya.za \
Operator: https://github.com/ansible/awx-operator/releases/tag/Xb.Yb.zb \
\
The AWX team.
11 .github/workflows/ci.yml vendored

@@ -111,9 +111,18 @@ jobs:
          repository: ansible/awx-operator
          path: awx-operator

      - name: Get python version from Makefile
        working-directory: awx
        run: echo py_version=`make PYTHON_VERSION` >> $GITHUB_ENV

      - name: Install python ${{ env.py_version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ env.py_version }}

      - name: Install playbook dependencies
        run: |
          python3 -m pip install docker setuptools_scm
          python3 -m pip install docker

      - name: Build AWX image
        working-directory: awx
45 .github/workflows/pr_body_check.yml vendored Normal file

@@ -0,0 +1,45 @@
---
name: PR Check
env:
  BRANCH: ${{ github.base_ref || 'devel' }}
on:
  pull_request:
    types: [opened, edited, reopened, synchronize]
jobs:
  pr-check:
    name: Scan PR description for semantic versioning keywords
    runs-on: ubuntu-latest
    permissions:
      packages: write
      contents: read
    steps:
      - name: Write PR body to a file
        run: |
          cat >> pr.body << __SOME_RANDOM_PR_EOF__
          ${{ github.event.pull_request.body }}
          __SOME_RANDOM_PR_EOF__

      - name: Display the received body for troubleshooting
        run: cat pr.body

      # We want to write these out individually just incase the options were joined on a single line
      - name: Check for each of the lines
        run: |
          grep "Bug, Docs Fix or other nominal change" pr.body > Z
          grep "New or Enhanced Feature" pr.body > Y
          grep "Breaking Change" pr.body > X
          exit 0
        # We exit 0 and set the shell to prevent the returns from the greps from failing this step
        # See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#exit-codes-and-error-action-preference
        shell: bash {0}

      - name: Check for exactly one item
        run: |
          if [ $(cat X Y Z | wc -l) != 1 ] ; then
            echo "The PR body must contain exactly one of [ 'Bug, Docs Fix or other nominal change', 'New or Enhanced Feature', 'Breaking Change' ]"
            echo "We counted $(cat X Y Z | wc -l)"
            echo "See the default PR body for examples"
            exit 255;
          else
            exit 0;
          fi
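In short, the workflow requires the PR body to keep exactly one of the three change-type lines from the PR template. A rough local equivalent of the grep-and-count logic (illustrative only, not part of the repository):

```python
#!/usr/bin/env python3
# Illustrative local mirror of pr_body_check.yml; not shipped with AWX.
import sys

CHANGE_TYPES = (
    "Bug, Docs Fix or other nominal change",
    "New or Enhanced Feature",
    "Breaking Change",
)


def count_change_type_lines(body: str) -> int:
    # Mirrors the three greps piped through wc -l: each (line, phrase) hit counts once.
    return sum(phrase in line for line in body.splitlines() for phrase in CHANGE_TYPES)


if __name__ == "__main__":
    if count_change_type_lines(sys.stdin.read()) != 1:
        print("The PR body must contain exactly one of:", ", ".join(CHANGE_TYPES))
        sys.exit(255)
```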
2 .github/workflows/stage.yml vendored

@@ -65,7 +65,7 @@ jobs:

      - name: Install playbook dependencies
        run: |
          python3 -m pip install docker setuptools_scm
          python3 -m pip install docker

      - name: Build and stage AWX
        working-directory: awx
@@ -19,16 +19,17 @@ Have questions about this document or anything not covered here? Come chat with
- [Purging containers and images](#purging-containers-and-images)
- [Pre commit hooks](#pre-commit-hooks)
- [What should I work on?](#what-should-i-work-on)
- [Translations](#translations)
- [Submitting Pull Requests](#submitting-pull-requests)
- [PR Checks run by Zuul](#pr-checks-run-by-zuul)
- [Reporting Issues](#reporting-issues)
- [Getting Help](#getting-help)

## Things to know prior to submitting code

- All code submissions are done through pull requests against the `devel` branch.
- You must use `git commit --signoff` for any commit to be merged, and agree that usage of --signoff constitutes agreement with the terms of [DCO 1.1](./DCO_1_1.md).
- Take care to make sure no merge commits are in the submission, and use `git rebase` vs `git merge` for this reason.
- If collaborating with someone else on the same branch, consider using `--force-with-lease` instead of `--force`. This will prevent you from accidentally overwriting commits pushed by someone else. For more information, see [git push docs](https://git-scm.com/docs/git-push#git-push---force-with-leaseltrefnamegt).
- If submitting a large code change, it's a good idea to join the `#ansible-awx` channel on irc.libera.chat, and talk about what you would like to do or add first. This not only helps everyone know what's going on, it also helps save time and effort, if the community decides some changes are needed.
- We ask all of our community members and contributors to adhere to the [Ansible code of conduct](http://docs.ansible.com/ansible/latest/community/code_of_conduct.html). If you have questions, or need assistance, please reach out to our community team at [codeofconduct@ansible.com](mailto:codeofconduct@ansible.com)

@@ -42,8 +43,7 @@ The AWX development environment workflow and toolchain uses Docker and the docke

Prior to starting the development services, you'll need `docker` and `docker-compose`. On Linux, you can generally find these in your distro's packaging, but you may find that Docker themselves maintain a separate repo that tracks more closely to the latest releases.

For macOS and Windows, we recommend [Docker for Mac](https://www.docker.com/docker-mac) and [Docker for Windows](https://www.docker.com/docker-windows) respectively.

For Linux platforms, refer to the following from Docker:

@@ -79,17 +79,13 @@ See the [README.md](./tools/docker-compose/README.md) for docs on how to build t

### Building API Documentation

AWX includes support for building [Swagger/OpenAPI documentation](https://swagger.io). To build the documentation locally, run:

```bash
(container)/awx_devel$ make swagger
```

This will write a file named `swagger.json` that contains the API specification in OpenAPI format. A variety of online tools are available for translating this data into more consumable formats (such as HTML). http://editor.swagger.io is an example of one such service.
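If you prefer to inspect the generated schema from the command line instead of an online viewer, a small helper along these lines (illustrative only, not part of AWX) can list the documented endpoints from `swagger.json`:

```python
# Illustrative helper (not shipped with AWX): list endpoints from swagger.json.
import json

with open("swagger.json") as f:
    spec = json.load(f)

HTTP_METHODS = {"get", "post", "put", "patch", "delete", "head", "options"}
for path, operations in sorted(spec.get("paths", {}).items()):
    methods = ", ".join(sorted(m.upper() for m in operations if m.lower() in HTTP_METHODS))
    print(f"{path}: {methods}")
```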
### Accessing the AWX web interface

@@ -115,20 +111,30 @@ While you can use environment variables to skip the pre-commit hooks GitHub will

## What should I work on?

We have a ["good first issue" label](https://github.com/ansible/awx/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) we put on some issues that might be a good starting point for new contributors.

Fixing bugs and updating the documentation are always appreciated, so reviewing the backlog of issues is always a good place to start.

For feature work, take a look at the current [Enhancements](https://github.com/ansible/awx/issues?q=is%3Aissue+is%3Aopen+label%3Atype%3Aenhancement).

If it has someone assigned to it then that person is the person responsible for working the enhancement. If you feel like you could contribute then reach out to that person.

Fixing bugs, adding translations, and updating the documentation are always appreciated, so reviewing the backlog of issues is always a good place to start. For extra information on debugging tools, see [Debugging](./docs/debugging/).

**NOTES**

> Issue assignment will only be done for maintainers of the project. If you decide to work on an issue, please feel free to add a comment in the issue to let others know that you are working on it; but know that we will accept the first pull request from whomever is able to fix an issue. Once your PR is accepted we can add you as an assignee to an issue upon request.

**NOTE**

> If you work in a part of the codebase that is going through active development, your changes may be rejected, or you may be asked to `rebase`. A good idea before starting work is to have a discussion with us in the `#ansible-awx` channel on irc.libera.chat, or on the [mailing list](https://groups.google.com/forum/#!forum/awx-project).

**NOTE**

> If you're planning to develop features or fixes for the UI, please review the [UI Developer doc](./awx/ui/README.md).

### Translations

At this time we do not accept PRs for adding additional language translations as we have an automated process for generating our translations. This is because translations require constant care as new strings are added and changed in the code base. Because of this the .po files are overwritten during every translation release cycle. We also can't support a lot of translations on AWX as its an open source project and each language adds time and cost to maintain. If you would like to see AWX translated into a new language please create an issue and ask others you know to upvote the issue. Our translation team will review the needs of the community and see what they can do around supporting additional language.

If you find an issue with an existing translation, please see the [Reporting Issues](#reporting-issues) section to open an issue and our translation team will work with you on a resolution.

## Submitting Pull Requests

Fixes and Features for AWX will go through the Github pull request process. Submit your pull request (PR) against the `devel` branch.

@@ -152,28 +158,14 @@ We like to keep our commit history clean, and will require resubmission of pull

Sometimes it might take us a while to fully review your PR. We try to keep the `devel` branch in good working order, and so we review requests carefully. Please be patient.

All submitted PRs will have the linter and unit tests run against them via Zuul, and the status reported in the PR.

## PR Checks run by Zuul

Zuul jobs for awx are defined in the [zuul-jobs](https://github.com/ansible/zuul-jobs) repo.

Zuul runs the following checks that must pass:

1. `tox-awx-api-lint`
2. `tox-awx-ui-lint`
3. `tox-awx-api`
4. `tox-awx-ui`
5. `tox-awx-swagger`

Zuul runs the following checks that are non-voting (can not pass but serve to inform PR reviewers):

1. `tox-awx-detect-schema-change`
   This check generates the schema and diffs it against a reference copy of the `devel` version of the schema.
   Reviewers should inspect the `job-output.txt.gz` related to the check if their is a failure (grep for `diff -u -b` to find beginning of diff).
   If the schema change is expected and makes sense in relation to the changes made by the PR, then you are good to go!
   If not, the schema changes should be fixed, but this decision must be enforced by reviewers.

When your PR is initially submitted the checks will not be run until a maintainer allows them to be. Once a maintainer has done a quick review of your work the PR will have the linter and unit tests run against them via GitHub Actions, and the status reported in the PR.

## Reporting Issues

We welcome your feedback, and encourage you to file an issue when you run into a problem. But before opening a new issues, we ask that you please view our [Issues guide](./ISSUES.md).

## Getting Help

If you require additional assistance, please reach out to us at `#ansible-awx` on irc.libera.chat, or submit your question to the [mailing list](https://groups.google.com/forum/#!forum/awx-project).

For extra information on debugging tools, see [Debugging](./docs/debugging/).
@@ -232,6 +232,9 @@ class FieldLookupBackend(BaseFilterBackend):
                re.compile(value)
            except re.error as e:
                raise ValueError(e.args[0])
        elif new_lookup.endswith('__iexact'):
            if not isinstance(field, (CharField, TextField)):
                raise ValueError(f'{field.name} is not a text field and cannot be filtered by case-insensitive search')
        elif new_lookup.endswith('__search'):
            related_model = getattr(field, 'related_model', None)
            if not related_model:

@@ -258,8 +261,8 @@ class FieldLookupBackend(BaseFilterBackend):
        search_filters = {}
        needs_distinct = False
        # Can only have two values: 'AND', 'OR'
        # If 'AND' is used, an iterm must satisfy all condition to show up in the results.
        # If 'OR' is used, an item just need to satisfy one condition to appear in results.
        # If 'AND' is used, an item must satisfy all conditions to show up in the results.
        # If 'OR' is used, an item just needs to satisfy one condition to appear in results.
        search_filter_relation = 'OR'
        for key, values in request.query_params.lists():
            if key in self.RESERVED_NAMES:
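The corrected comments above describe the two search relations. A plain-Python sketch (not AWX code) makes the difference concrete:

```python
# Plain-Python sketch (not AWX code) of the 'AND'/'OR' relation described above.
def item_matches(field_values, terms, relation='OR'):
    # One boolean per search term: does any field value contain that term?
    hits = [any(term.lower() in str(value).lower() for value in field_values) for term in terms]
    return all(hits) if relation == 'AND' else any(hits)


fields = ['web-server', 'production inventory']
assert item_matches(fields, ['web', 'db'], relation='OR') is True   # one matching term is enough
assert item_matches(fields, ['web', 'db'], relation='AND') is False  # every term must match
```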
@@ -29,7 +29,6 @@ from django.utils.translation import gettext_lazy as _
from django.utils.encoding import force_str
from django.utils.text import capfirst
from django.utils.timezone import now
from django.utils.functional import cached_property

# Django REST Framework
from rest_framework.exceptions import ValidationError, PermissionDenied

@@ -5008,8 +5007,7 @@ class ActivityStreamSerializer(BaseSerializer):
    object_association = serializers.SerializerMethodField(help_text=_("When present, shows the field name of the role or relationship that changed."))
    object_type = serializers.SerializerMethodField(help_text=_("When present, shows the model on which the role or relationship was defined."))

    @cached_property
    def _local_summarizable_fk_fields(self):
    def _local_summarizable_fk_fields(self, obj):
        summary_dict = copy.copy(SUMMARIZABLE_FK_FIELDS)
        # Special requests
        summary_dict['group'] = summary_dict['group'] + ('inventory_id',)

@@ -5029,7 +5027,13 @@ class ActivityStreamSerializer(BaseSerializer):
            ('workflow_approval', ('id', 'name', 'unified_job_id')),
            ('instance', ('id', 'hostname')),
        ]
        return field_list
        # Optimization - do not attempt to summarize all fields, pair down to only relations that exist
        if not obj:
            return field_list
        existing_association_types = [obj.object1, obj.object2]
        if 'user' in existing_association_types:
            existing_association_types.append('role')
        return [entry for entry in field_list if entry[0] in existing_association_types]

    class Meta:
        model = ActivityStream

@@ -5113,7 +5117,7 @@ class ActivityStreamSerializer(BaseSerializer):
        data = {}
        if obj.actor is not None:
            data['actor'] = self.reverse('api:user_detail', kwargs={'pk': obj.actor.pk})
        for fk, __ in self._local_summarizable_fk_fields:
        for fk, __ in self._local_summarizable_fk_fields(obj):
            if not hasattr(obj, fk):
                continue
            m2m_list = self._get_related_objects(obj, fk)

@@ -5170,7 +5174,7 @@ class ActivityStreamSerializer(BaseSerializer):

    def get_summary_fields(self, obj):
        summary_fields = OrderedDict()
        for fk, related_fields in self._local_summarizable_fk_fields:
        for fk, related_fields in self._local_summarizable_fk_fields(obj):
            try:
                if not hasattr(obj, fk):
                    continue
@@ -1440,7 +1440,7 @@ msgstr "指定した認証情報は無効 (HTTP 401) です。"

#: awx/api/views/root.py:193 awx/api/views/root.py:234
msgid "Unable to connect to proxy server."
msgstr "プロキシサーバーに接続できません。"
msgstr "プロキシーサーバーに接続できません。"

#: awx/api/views/root.py:195 awx/api/views/root.py:236
msgid "Could not connect to subscription service."

@@ -1976,7 +1976,7 @@ msgstr "リモートホスト名または IP を判別するために検索す

#: awx/main/conf.py:85
msgid "Proxy IP Allowed List"
msgstr "プロキシ IP 許可リスト"
msgstr "プロキシー IP 許可リスト"

#: awx/main/conf.py:87
msgid ""

@@ -2198,7 +2198,7 @@ msgid ""
"Follow symbolic links when scanning for playbooks. Be aware that setting "
"this to True can lead to infinite recursion if a link points to a parent "
"directory of itself."
msgstr "Playbook をスキャンするときは、シンボリックリンクをたどってください。リンクがそれ自体の親ディレクトリーを指している場合は、これを True に設定すると、無限再帰が発生する可能性があることに注意してください。"
msgstr "Playbook のスキャン時にシンボリックリンクをたどります。リンクが親ディレクトリーを参照している場合には、この設定を True に指定すると無限再帰が発生する可能性があります。"

#: awx/main/conf.py:337
msgid "Ignore Ansible Galaxy SSL Certificate Verification"

@@ -2499,7 +2499,7 @@ msgstr "Insights for Ansible Automation Platform の最終収集日。"
msgid ""
"Last gathered entries for expensive collectors for Insights for Ansible "
"Automation Platform."
msgstr "Insights for Ansible Automation Platform の高価なコレクターの最後に収集されたエントリー。"
msgstr "Insights for Ansible Automation Platform でコストがかかっているコレクターに関して最後に収集されたエントリー"

#: awx/main/conf.py:686
msgid "Insights for Ansible Automation Platform Gather Interval"

@@ -3692,7 +3692,7 @@ msgstr "タスクの開始"

#: awx/main/models/events.py:189
msgid "Variables Prompted"
msgstr "変数のプロモート"
msgstr "提示される変数"

#: awx/main/models/events.py:190
msgid "Gathering Facts"

@@ -3741,15 +3741,15 @@ msgstr "エラー"

#: awx/main/models/execution_environments.py:17
msgid "Always pull container before running."
msgstr "実行前に必ずコンテナーをプルしてください。"
msgstr "実行前に必ずコンテナーをプルする"

#: awx/main/models/execution_environments.py:18
msgid "Only pull the image if not present before running."
msgstr "実行する前に、存在しない場合にのみイメージをプルしてください。"
msgstr "イメージが存在しない場合のみ実行前にプルする"

#: awx/main/models/execution_environments.py:19
msgid "Never pull container before running."
msgstr "実行前にコンテナーをプルしないでください。"
msgstr "実行前にコンテナーをプルしない"

#: awx/main/models/execution_environments.py:29
msgid ""

@@ -5228,7 +5228,7 @@ msgid ""
"SSL) or \"ldaps://ldap.example.com:636\" (SSL). Multiple LDAP servers may be "
"specified by separating with spaces or commas. LDAP authentication is "
"disabled if this parameter is empty."
msgstr "\"ldap://ldap.example.com:389\" (非 SSL) または \"ldaps://ldap.example.com:636\" (SSL) などの LDAP サーバーに接続する URI です。複数の LDAP サーバーをスペースまたはカンマで区切って指定できます。LDAP 認証は、このパラメーターが空の場合は無効になります。"
msgstr "\"ldap://ldap.example.com:389\" (非 SSL) または \"ldaps://ldap.example.com:636\" (SSL) などの LDAP サーバーに接続する URI です。複数の LDAP サーバーをスペースまたはコンマで区切って指定できます。LDAP 認証は、このパラメーターが空の場合は無効になります。"

#: awx/sso/conf.py:170 awx/sso/conf.py:187 awx/sso/conf.py:198
#: awx/sso/conf.py:209 awx/sso/conf.py:226 awx/sso/conf.py:244

@@ -6236,4 +6236,5 @@ msgstr "%s が現在アップグレード中です。"

#: awx/ui/urls.py:24
msgid "This page will refresh when complete."
msgstr "このページは完了すると更新されます。"
@@ -956,7 +956,7 @@ msgstr "인스턴스 그룹의 인스턴스"

#: awx/api/views/__init__.py:450
msgid "Schedules"
msgstr "일정"
msgstr "스케줄"

#: awx/api/views/__init__.py:464
msgid "Schedule Recurrence Rule Preview"

@@ -3261,7 +3261,7 @@ msgstr "JSON 또는 YAML 구문을 사용하여 인젝터를 입력합니다.
#: awx/main/models/credential/__init__.py:412
#, python-format
msgid "adding %s credential type"
msgstr "인증 정보 유형 %s 추가 중"

#: awx/main/models/credential/__init__.py:590
#: awx/main/models/credential/__init__.py:672

@@ -6236,4 +6236,5 @@ msgstr "%s 현재 업그레이드 중입니다."

#: awx/ui/urls.py:24
msgid "This page will refresh when complete."
msgstr "완료되면 이 페이지가 새로 고침됩니다."
@@ -348,7 +348,7 @@ msgstr "SCM track_submodules 只能用于 git 项目。"
msgid ""
"Only Container Registry credentials can be associated with an Execution "
"Environment"
msgstr "只有容器 registry 凭证可以与执行环境关联"
msgstr "只有容器注册表凭证才可以与执行环境关联"

#: awx/api/serializers.py:1440
msgid "Cannot change the organization of an execution environment"

@@ -629,7 +629,7 @@ msgstr "不支持在不替换的情况下在启动时删除 {} 凭证。提供

#: awx/api/serializers.py:4338
msgid "The inventory associated with this Workflow is being deleted."
msgstr "与此 Workflow 关联的清单将被删除。"
msgstr "与此工作流关联的清单将被删除。"

#: awx/api/serializers.py:4405
msgid "Message type '{}' invalid, must be either 'message' or 'body'"

@@ -3229,7 +3229,7 @@ msgstr "云"
#: awx/main/models/credential/__init__.py:336
#: awx/main/models/credential/__init__.py:1113
msgid "Container Registry"
msgstr "容器 Registry"
msgstr "容器注册表"

#: awx/main/models/credential/__init__.py:337
msgid "Personal Access Token"

@@ -3560,7 +3560,7 @@ msgstr "身份验证 URL"

#: awx/main/models/credential/__init__.py:1120
msgid "Authentication endpoint for the container registry."
msgstr "容器 registry 的身份验证端点。"
msgstr "容器注册表的身份验证端点。"

#: awx/main/models/credential/__init__.py:1130
msgid "Password or Token"

@@ -3764,7 +3764,7 @@ msgstr "镜像位置"
msgid ""
"The full image location, including the container registry, image name, and "
"version tag."
msgstr "完整镜像位置,包括容器 registry、镜像名称和版本标签。"
msgstr "完整镜像位置,包括容器注册表、镜像名称和版本标签。"

#: awx/main/models/execution_environments.py:51
msgid "Pull image before running?"

@@ -6238,4 +6238,5 @@ msgstr "%s 当前正在升级。"

#: awx/ui/urls.py:24
msgid "This page will refresh when complete."
msgstr "完成后,此页面会刷新。"
@@ -396,7 +396,7 @@ def events_table_partitioned_modified(since, full_path, until, **kwargs):
    return _events_table(since, full_path, until, 'main_jobevent', 'modified', project_job_created=True, **kwargs)


@register('unified_jobs_table', '1.3', format='csv', description=_('Data on jobs run'), expensive=four_hour_slicing)
@register('unified_jobs_table', '1.4', format='csv', description=_('Data on jobs run'), expensive=four_hour_slicing)
def unified_jobs_table(since, full_path, until, **kwargs):
    unified_job_query = '''COPY (SELECT main_unifiedjob.id,
                                        main_unifiedjob.polymorphic_ctype_id,

@@ -422,7 +422,8 @@ def unified_jobs_table(since, full_path, until, **kwargs):
                                        main_unifiedjob.job_explanation,
                                        main_unifiedjob.instance_group_id,
                                        main_unifiedjob.installed_collections,
                                        main_unifiedjob.ansible_version
                                        main_unifiedjob.ansible_version,
                                        main_job.forks
                                 FROM main_unifiedjob
                                 JOIN django_content_type ON main_unifiedjob.polymorphic_ctype_id = django_content_type.id
                                 LEFT JOIN main_job ON main_unifiedjob.id = main_job.unifiedjob_ptr_id
@@ -408,6 +408,7 @@ class JobNotificationMixin(object):
            'inventory': 'Stub Inventory',
            'id': 42,
            'hosts': {},
            'extra_vars': {},
            'friendly_name': 'Job',
            'finished': False,
            'credential': 'Stub credential',
@@ -114,13 +114,6 @@ class Organization(CommonModel, NotificationFieldsModel, ResourceMixin, CustomVi
    def _get_related_jobs(self):
        return UnifiedJob.objects.non_polymorphic().filter(organization=self)

    def create_default_galaxy_credential(self):
        from awx.main.models import Credential

        public_galaxy_credential = Credential.objects.filter(managed=True, name='Ansible Galaxy').first()
        if public_galaxy_credential is not None and public_galaxy_credential not in self.galaxy_credentials.all():
            self.galaxy_credentials.add(public_galaxy_credential)


class OrganizationGalaxyCredentialMembership(models.Model):
@@ -659,6 +659,13 @@ class WorkflowJob(UnifiedJob, WorkflowJobOptions, SurveyJobMixin, JobNotificatio
                node_job_description = 'job #{0}, "{1}", which finished with status {2}.'.format(node.job.id, node.job.name, node.job.status)
                str_arr.append("- node #{0} spawns {1}".format(node.id, node_job_description))
        result['body'] = '\n'.join(str_arr)
        result.update(
            dict(
                inventory=self.inventory.name if self.inventory else None,
                limit=self.limit,
                extra_vars=self.display_extra_vars(),
            )
        )
        return result

    @property

@@ -906,3 +913,12 @@ class WorkflowApproval(UnifiedJob, JobNotificationMixin):
    @property
    def workflow_job(self):
        return self.unified_job_node.workflow_job

    def notification_data(self):
        result = super(WorkflowApproval, self).notification_data()
        result.update(
            dict(
                extra_vars=self.workflow_job.display_extra_vars(),
            )
        )
        return result
@@ -409,7 +409,7 @@ def emit_activity_stream_change(instance):
    from awx.api.serializers import ActivityStreamSerializer

    actor = None
    if instance.actor:
    if instance.actor_id:
        actor = instance.actor.username
    summary_fields = ActivityStreamSerializer(instance).get_summary_fields(instance)
    analytics_logger.info(
@@ -20,7 +20,7 @@ def test_activity_stream_related():
    """
    serializer_related = set(
        ActivityStream._meta.get_field(field_name).related_model
        for field_name, stuff in ActivityStreamSerializer()._local_summarizable_fk_fields
        for field_name, stuff in ActivityStreamSerializer()._local_summarizable_fk_fields(None)
        if hasattr(ActivityStream, field_name)
    )
@@ -79,6 +79,19 @@ def test_invalid_field():
    assert 'is not an allowed field name. Must be ascii encodable.' in str(excinfo.value)


def test_valid_iexact():
    field_lookup = FieldLookupBackend()
    value, new_lookup, _ = field_lookup.value_to_python(JobTemplate, 'project__name__iexact', 'foo')
    assert 'foo' in value


def test_invalid_iexact():
    field_lookup = FieldLookupBackend()
    with pytest.raises(ValueError) as excinfo:
        field_lookup.value_to_python(Job, 'id__iexact', '1')
    assert 'is not a text field and cannot be filtered by case-insensitive search' in str(excinfo.value)


@pytest.mark.parametrize('lookup_suffix', ['', 'contains', 'startswith', 'in'])
@pytest.mark.parametrize('password_field', Credential.PASSWORD_FIELDS)
def test_filter_on_password_field(password_field, lookup_suffix):
@@ -1537,9 +1537,11 @@ register(
        ('is_superuser_attr', 'saml_attr'),
        ('is_superuser_value', 'value'),
        ('is_superuser_role', 'saml_role'),
        ('remove_superusers', True),
        ('is_system_auditor_attr', 'saml_attr'),
        ('is_system_auditor_value', 'value'),
        ('is_system_auditor_role', 'saml_role'),
        ('remove_system_auditors', True),
    ],
)
@@ -743,8 +743,10 @@ class SAMLUserFlagsAttrField(HybridDictField):
    is_superuser_attr = fields.CharField(required=False, allow_null=True)
    is_superuser_value = fields.CharField(required=False, allow_null=True)
    is_superuser_role = fields.CharField(required=False, allow_null=True)
    remove_superusers = fields.BooleanField(required=False, allow_null=True)
    is_system_auditor_attr = fields.CharField(required=False, allow_null=True)
    is_system_auditor_value = fields.CharField(required=False, allow_null=True)
    is_system_auditor_role = fields.CharField(required=False, allow_null=True)
    remove_system_auditors = fields.BooleanField(required=False, allow_null=True)

    child = _Forbidden()
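The two new `remove_*` keys control whether an existing flag is stripped when the SAML attributes no longer grant it. A hedged illustration of such a mapping in settings (the setting name and attribute values here are assumptions, not taken from this diff):

```python
# Illustrative only: key names follow the serializer fields above; the setting
# name and attribute values are assumptions.
SOCIAL_AUTH_SAML_USER_FLAGS_BY_ATTR = {
    'is_superuser_attr': 'memberOf',
    'is_superuser_value': 'awx-admins',
    'remove_superusers': False,  # keep existing superusers even when the attribute no longer matches
    'is_system_auditor_role': 'awx-auditors',
    'remove_system_auditors': True,
}
```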
@@ -77,6 +77,21 @@ def _update_m2m_from_expression(user, related, expr, remove=True):
            related.remove(user)


def get_or_create_with_default_galaxy_cred(**kwargs):
    from awx.main.models import Organization, Credential

    (org, org_created) = Organization.objects.get_or_create(**kwargs)
    if org_created:
        logger.debug("Created org {} (id {}) from {}".format(org.name, org.id, kwargs))
        public_galaxy_credential = Credential.objects.filter(managed=True, name='Ansible Galaxy').first()
        if public_galaxy_credential is not None:
            org.galaxy_credentials.add(public_galaxy_credential)
            logger.debug("Added default Ansible Galaxy credential to org")
        else:
            logger.debug("Could not find default Ansible Galaxy credential to add to org")
    return org


def _update_org_from_attr(user, related, attr, remove, remove_admins, remove_auditors, backend):
    from awx.main.models import Organization
    from django.conf import settings

@@ -94,8 +109,7 @@ def _update_org_from_attr(user, related, attr, remove, remove_admins, remove_aud
                    organization_name = org_name
            except Exception:
                organization_name = org_name
            org = Organization.objects.get_or_create(name=organization_name)[0]
            org.create_default_galaxy_credential()
            org = get_or_create_with_default_galaxy_cred(name=organization_name)
        else:
            org = Organization.objects.get(name=org_name)
    except ObjectDoesNotExist:

@@ -121,7 +135,6 @@ def update_user_orgs(backend, details, user=None, *args, **kwargs):
    """
    if not user:
        return
    from awx.main.models import Organization

    org_map = backend.setting('ORGANIZATION_MAP') or {}
    for org_name, org_opts in org_map.items():

@@ -130,8 +143,7 @@ def update_user_orgs(backend, details, user=None, *args, **kwargs):
            organization_name = organization_alias
        else:
            organization_name = org_name
        org = Organization.objects.get_or_create(name=organization_name)[0]
        org.create_default_galaxy_credential()
        org = get_or_create_with_default_galaxy_cred(name=organization_name)

        # Update org admins from expression(s).
        remove = bool(org_opts.get('remove', True))

@@ -152,15 +164,14 @@ def update_user_teams(backend, details, user=None, *args, **kwargs):
    """
    if not user:
        return
    from awx.main.models import Organization, Team
    from awx.main.models import Team

    team_map = backend.setting('TEAM_MAP') or {}
    for team_name, team_opts in team_map.items():
        # Get or create the org to update.
        if 'organization' not in team_opts:
            continue
        org = Organization.objects.get_or_create(name=team_opts['organization'])[0]
        org.create_default_galaxy_credential()
        org = get_or_create_with_default_galaxy_cred(name=team_opts['organization'])

        # Update team members from expression(s).
        team = Team.objects.get_or_create(name=team_name, organization=org)[0]

@@ -216,8 +227,7 @@ def update_user_teams_by_saml_attr(backend, details, user=None, *args, **kwargs)

        try:
            if settings.SAML_AUTO_CREATE_OBJECTS:
                org = Organization.objects.get_or_create(name=organization_name)[0]
                org.create_default_galaxy_credential()
                org = get_or_create_with_default_galaxy_cred(name=organization_name)
            else:
                org = Organization.objects.get(name=organization_name)
        except ObjectDoesNotExist:

@@ -245,6 +255,7 @@ def _check_flag(user, flag, attributes, user_flags_settings):
    is_role_key = "is_%s_role" % (flag)
    is_attr_key = "is_%s_attr" % (flag)
    is_value_key = "is_%s_value" % (flag)
    remove_setting = "remove_%ss" % (flag)

    # Check to see if we are respecting a role and, if so, does our user have that role?
    role_setting = user_flags_settings.get(is_role_key, None)

@@ -276,7 +287,7 @@ def _check_flag(user, flag, attributes, user_flags_settings):
            # if they don't match make sure that new_flag is false
            else:
                logger.debug(
                    "Refusing %s for %s because attr %s (%s) did not match value '%s'"
                    "For %s on %s attr %s (%s) did not match expected value '%s'"
                    % (flag, user.username, attr_setting, attribute_value, user_flags_settings.get(is_value_key))
                )
                new_flag = False

@@ -285,8 +296,16 @@ def _check_flag(user, flag, attributes, user_flags_settings):
            logger.debug("Giving %s %s from attribute %s" % (user.username, flag, attr_setting))
            new_flag = True

    # If the user was flagged and we are going to make them not flagged make sure there is a message
    # Get the users old flag
    old_value = getattr(user, "is_%s" % (flag))

    # If we are not removing the flag and they were a system admin and now we don't want them to be just return
    remove_flag = user_flags_settings.get(remove_setting, True)
    if not remove_flag and (old_value and not new_flag):
        logger.debug("Remove flag %s preventing removal of %s for %s" % (remove_flag, flag, user.username))
        return old_value, False

    # If the user was flagged and we are going to make them not flagged make sure there is a message
    if old_value and not new_flag:
        logger.debug("Revoking %s from %s" % (flag, user.username))
@@ -4,8 +4,8 @@ from unittest import mock

from django.utils.timezone import now

from awx.conf.registry import settings_registry
from awx.sso.pipeline import update_user_orgs, update_user_teams, update_user_orgs_by_saml_attr, update_user_teams_by_saml_attr, _check_flag

from awx.main.models import User, Team, Organization, Credential, CredentialType
@@ -92,8 +92,13 @@ class TestSAMLMap:
        assert Organization.objects.get(name="Default_Alias") is not None

        for o in Organization.objects.all():
            assert o.galaxy_credentials.count() == 1
            assert o.galaxy_credentials.first().name == 'Ansible Galaxy'
            if o.name == 'Default':
                # The default org was already created and should not have a galaxy credential
                assert o.galaxy_credentials.count() == 0
            else:
                # The Default_Alias was created by SAML and should get the galaxy credential
                assert o.galaxy_credentials.count() == 1
                assert o.galaxy_credentials.first().name == 'Ansible Galaxy'

    def test_update_user_teams(self, backend, users, galaxy_credential):
        u1, u2, u3 = users

@@ -203,7 +208,13 @@ class TestSAMLAttr:
            ],
        }

        return MockSettings()
        mock_settings_obj = MockSettings()
        for key in settings_registry.get_registered_settings(category_slug='logging'):
            value = settings_registry.get_setting_field(key).get_default()
            setattr(mock_settings_obj, key, value)
        setattr(mock_settings_obj, 'DEBUG', True)

        return mock_settings_obj

    @pytest.fixture
    def backend(self):

@@ -263,8 +274,13 @@ class TestSAMLAttr:
        assert Organization.objects.get(name="o1_alias").member_role.members.count() == 1

        for o in Organization.objects.all():
            assert o.galaxy_credentials.count() == 1
            assert o.galaxy_credentials.first().name == 'Ansible Galaxy'
            if o.id in [o1.id, o2.id, o3.id]:
                # o[123] were created without a default galaxy cred
                assert o.galaxy_credentials.count() == 0
            else:
                # anything else created should have a default galaxy cred
                assert o.galaxy_credentials.count() == 1
                assert o.galaxy_credentials.first().name == 'Ansible Galaxy'

    def test_update_user_teams_by_saml_attr(self, orgs, users, galaxy_credential, kwargs, mock_settings):
        with mock.patch('django.conf.settings', mock_settings):

@@ -322,8 +338,13 @@ class TestSAMLAttr:
        assert Team.objects.get(name='Green', organization__name='Default3').member_role.members.count() == 3

        for o in Organization.objects.all():
            assert o.galaxy_credentials.count() == 1
            assert o.galaxy_credentials.first().name == 'Ansible Galaxy'
            if o.id in [o1.id, o2.id, o3.id]:
                # o[123] were created without a default galaxy cred
                assert o.galaxy_credentials.count() == 0
            else:
                # anything else created should have a default galaxy cred
                assert o.galaxy_credentials.count() == 1
                assert o.galaxy_credentials.first().name == 'Ansible Galaxy'

    def test_update_user_teams_alias_by_saml_attr(self, orgs, users, galaxy_credential, kwargs, mock_settings):
        with mock.patch('django.conf.settings', mock_settings):
@@ -396,73 +417,113 @@ class TestSAMLAttr:
            assert o.galaxy_credentials.count() == 1
            assert o.galaxy_credentials.first().name == 'Ansible Galaxy'

    def test_galaxy_credential_no_auto_assign(self, users, kwargs, galaxy_credential, mock_settings):
        # A Galaxy credential should not be added to an existing org
        o = Organization.objects.create(name='Default1')
        o = Organization.objects.create(name='Default2')
        o = Organization.objects.create(name='Default3')
        o = Organization.objects.create(name='Default4')
        kwargs['response']['attributes']['memberOf'] = ['Default1']
        kwargs['response']['attributes']['groups'] = ['Blue']
        with mock.patch('django.conf.settings', mock_settings):
            for u in users:
                update_user_orgs_by_saml_attr(None, None, u, **kwargs)
                update_user_teams_by_saml_attr(None, None, u, **kwargs)

        assert Organization.objects.count() == 4
        for o in Organization.objects.all():
            assert o.galaxy_credentials.count() == 0


@pytest.mark.django_db
class TestSAMLUserFlags:
    @pytest.mark.parametrize(
        "user_flags_settings, expected",
        "user_flags_settings, expected, is_superuser",
        [
            # In this case we will pass no user flags so new_flag should be false and changed will def be false
            (
                {},
                (False, False),
                False,
            ),
            # In this case we will give the user a group to make them an admin
            (
                {'is_superuser_role': 'test-role-1'},
                (True, True),
                False,
            ),
            # In this case we will give the user a flag that will make then an admin
            (
                {'is_superuser_attr': 'is_superuser'},
                (True, True),
                False,
            ),
            # In this case we will give the user a flag but the wrong value
            (
                {'is_superuser_attr': 'is_superuser', 'is_superuser_value': 'junk'},
                (False, False),
                False,
            ),
            # In this case we will give the user a flag and the right value
            (
                {'is_superuser_attr': 'is_superuser', 'is_superuser_value': 'true'},
                (True, True),
                False,
            ),
            # In this case we will give the user a proper role and an is_superuser_attr role that they dont have, this should make them an admin
            (
                {'is_superuser_role': 'test-role-1', 'is_superuser_attr': 'gibberish', 'is_superuser_value': 'true'},
                (True, True),
                False,
            ),
            # In this case we will give the user a proper role and an is_superuser_attr role that they have, this should make them an admin
            (
                {'is_superuser_role': 'test-role-1', 'is_superuser_attr': 'test-role-1'},
                (True, True),
                False,
            ),
            # In this case we will give the user a proper role and an is_superuser_attr role that they have but a bad value, this should make them an admin
            (
                {'is_superuser_role': 'test-role-1', 'is_superuser_attr': 'is_superuser', 'is_superuser_value': 'junk'},
                (False, False),
                False,
            ),
            # In this case we will give the user everything
            (
                {'is_superuser_role': 'test-role-1', 'is_superuser_attr': 'is_superuser', 'is_superuser_value': 'true'},
                (True, True),
                False,
            ),
            # In this test case we will validate that a single attribute (instead of a list) still works
            (
                {'is_superuser_attr': 'name_id', 'is_superuser_value': 'test_id'},
                (True, True),
                False,
            ),
            # This will be a negative test for a single atrribute
            (
                {'is_superuser_attr': 'name_id', 'is_superuser_value': 'junk'},
                (False, False),
                False,
            ),
            # The user is already a superuser so we should remove them
            (
                {'is_superuser_attr': 'name_id', 'is_superuser_value': 'junk', 'remove_superusers': True},
                (False, True),
                True,
            ),
            # The user is already a superuser but we don't have a remove field
            (
                {'is_superuser_attr': 'name_id', 'is_superuser_value': 'junk', 'remove_superusers': False},
                (True, False),
                True,
            ),
        ],
    )
    def test__check_flag(self, user_flags_settings, expected):
    def test__check_flag(self, user_flags_settings, expected, is_superuser):
        user = User()
        user.username = 'John'
        user.is_superuser = False
        user.is_superuser = is_superuser

        attributes = {
            'email': ['noone@nowhere.com'],
@@ -123,9 +123,11 @@ class TestSAMLUserFlagsAttrField:
            {'is_superuser_attr': 'something'},
            {'is_superuser_value': 'value'},
            {'is_superuser_role': 'my_peeps'},
            {'remove_superusers': False},
            {'is_system_auditor_attr': 'something_else'},
            {'is_system_auditor_value': 'value2'},
            {'is_system_auditor_role': 'other_peeps'},
            {'remove_system_auditors': False},
        ],
    )
    def test_internal_value_valid(self, data):

@@ -165,6 +167,17 @@ class TestSAMLUserFlagsAttrField:
                    'junk2': ['Invalid field.'],
                },
            ),
            # make sure we can't pass a string to the boolean fields
            (
                {
                    'remove_superusers': 'test',
                    'remove_system_auditors': 'test',
                },
                {
                    "remove_superusers": ["Must be a valid boolean."],
                    "remove_system_auditors": ["Must be a valid boolean."],
                },
            ),
        ],
    )
    def test_internal_value_invalid(self, data, expected):
@@ -2,15 +2,39 @@ This document is meant to provide some guidance into the functionality of Job Ou

## Overview of the feature/screen. Summary of what it does/is

1. Elapsed time / unfollow button
2. Page up and page down buttons
3. Unique qualities of the different job types.
Joboutput is a feature that allows users to see how their job is doing as it is being run.
This feature displays data sent to the UI via websockets that are connected to several
different endpoints in the API.

- Some don’t allow search by event data and thus Event is not an option in the drop down
- Some don’t have expand, collapse
The job output has 2 different states that result in different functionality. One state
is when, the job is actively running. There is limited functionality because of how the
job events are processed when they reach the UI. While the job is running, and
output is coming into the UI, the following features turn off:

4. Differences in the output from when a job is running and when a job is complete.
5. Which features are enabled when it’s running and which aren’t.
1. [Search](#Search)- The ability to search the output of a job.
2. [Expand/Collapse](#Expand/Collapse)- The ability to expand and collapse job events, tasks, plays, or even the
job itself. The only part of the job ouput that is not collapsable is the playbook summary (only jobs that
are executed from a Job Template have Expand/Collapse functionality).

The following features are enabled:

1. Follow/unfollow - `Follow` indicates you are streaming the output on the screen
as it comes into the UI. If you see some output that you want to examine closer while the job is running
scroll to it, and click `Unfollow`, and the output will stop streaming onto the screen. This feature is only
enabled when the job is running and is not complete. If the user scrolls up in the output the UI will unfollow.
2. Page up and page down buttons- Use these buttons to navigate quickly up and down the output.



After the job is complete, the Follow/Unfollow button disabled, and Expand/Collapse and Search become enabled.


Not all job types are created equal. Some jobs have a concept of parent-child events. Job events can be inside a Task,
a Task can be inside a Play, and a Play inside a Playbook. Leveraging this concept to enable Expand/Collapse for these
job types, allows you to collapse and hide the children of a particular line of output. This parent-child event
relationship only exists on jobs executed from a job template. All other types of jobs do not
have this event concept, and therefore, do not have Expand/Collapse functionality. By default all job
events are expanded.

## How output works generally.

@@ -26,11 +50,13 @@ This document is meant to provide some guidance into the functionality of Job Ou

## Non-standard cases

1. When an event comes into the output that has a parent, but the parent hasn’t arrived yet.
2. When an event that has children arrives in output, but the children are not present yet
2. When an event with children arrives in output, but the children are not yet present.
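Both non-standard cases come down to events arriving out of order. An illustrative sketch of one way to handle that (plain Python, not the actual UI state shape; `uuid` and `parent_uuid` are assumed field names) is to index events by uuid and park children whose parent has not arrived yet:

```python
# Illustrative only: buffer out-of-order job events until their parent arrives.
events = {}   # uuid -> event
orphans = {}  # parent uuid -> children that arrived before their parent


def receive(event):
    uuid, parent = event['uuid'], event.get('parent_uuid')
    event.setdefault('children', [])
    events[uuid] = event
    if parent and parent in events:
        events[parent]['children'].append(event)
    elif parent:
        # Case 1: parent not here yet, park this event until it shows up.
        orphans.setdefault(parent, []).append(event)
    # Case 2: this event's children arrived first, adopt any parked ones.
    event['children'].extend(orphans.pop(uuid, []))
```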
## Expand collapse a single event- how it works and how it changes the state object
## Expand/Collapse

## Expand collapse all- how it works and how it changes the state object
### Expand collapse a single event - how it works and how it changes the state object

### Expand collapse all - how it works and how it changes the state object

## Search
BIN awx/ui/docs/images/JobOutput-complete.png (new binary file, 22 KiB, not shown)
BIN awx/ui/docs/images/JobOutput-running.png (new binary file, 26 KiB, not shown)
@@ -24,7 +24,7 @@
    </script>
    <meta
      http-equiv="Content-Security-Policy"
      content="default-src 'self'; connect-src 'self' ws: wss:; style-src 'self' 'unsafe-inline'; script-src 'self' 'nonce-{{ csp_nonce }}' *.pendo.io https://d3js.org; img-src 'self' *.pendo.io data:; worker-src 'self' blob: ;"
      content="default-src 'self'; connect-src 'self' ws: wss:; style-src 'self' 'unsafe-inline'; script-src 'self' 'nonce-{{ csp_nonce }}' *.pendo.io; img-src 'self' *.pendo.io data:; worker-src 'self' blob: ;"
    />
    <link rel="shortcut icon" href="{% static 'media/favicon.ico' %}" />
    <% } else { %>
3 awx/ui/public/static/js/d3-collection.v1.min.js vendored Normal file

@@ -0,0 +1,3 @@
/* eslint-disable */
// https://d3js.org/d3-collection/ v1.0.7 Copyright 2018 Mike Bostock
!function(n,t){"object"==typeof exports&&"undefined"!=typeof module?t(exports):"function"==typeof define&&define.amd?define(["exports"],t):t(n.d3=n.d3||{})}(this,function(n){"use strict";function t(){}function e(n,e){var r=new t;if(n instanceof t)n.each(function(n,t){r.set(t,n)});else if(Array.isArray(n)){var i,u=-1,o=n.length;if(null==e)for(;++u<o;)r.set(u,n[u]);else for(;++u<o;)r.set(e(i=n[u],u,n),i)}else if(n)for(var s in n)r.set(s,n[s]);return r}function r(){return{}}function i(n,t,e){n[t]=e}function u(){return e()}function o(n,t,e){n.set(t,e)}function s(){}t.prototype=e.prototype={constructor:t,has:function(n){return"$"+n in this},get:function(n){return this["$"+n]},set:function(n,t){return this["$"+n]=t,this},remove:function(n){var t="$"+n;return t in this&&delete this[t]},clear:function(){for(var n in this)"$"===n[0]&&delete this[n]},keys:function(){var n=[];for(var t in this)"$"===t[0]&&n.push(t.slice(1));return n},values:function(){var n=[];for(var t in this)"$"===t[0]&&n.push(this[t]);return n},entries:function(){var n=[];for(var t in this)"$"===t[0]&&n.push({key:t.slice(1),value:this[t]});return n},size:function(){var n=0;for(var t in this)"$"===t[0]&&++n;return n},empty:function(){for(var n in this)if("$"===n[0])return!1;return!0},each:function(n){for(var t in this)"$"===t[0]&&n(this[t],t.slice(1),this)}};var f=e.prototype;function c(n,t){var e=new s;if(n instanceof s)n.each(function(n){e.add(n)});else if(n){var r=-1,i=n.length;if(null==t)for(;++r<i;)e.add(n[r]);else for(;++r<i;)e.add(t(n[r],r,n))}return e}s.prototype=c.prototype={constructor:s,has:f.has,add:function(n){return this["$"+(n+="")]=n,this},remove:f.remove,clear:f.clear,values:f.keys,size:f.size,empty:f.empty,each:f.each},n.nest=function(){var n,t,s,f=[],c=[];function a(r,i,u,o){if(i>=f.length)return null!=n&&r.sort(n),null!=t?t(r):r;for(var s,c,h,l=-1,v=r.length,p=f[i++],y=e(),d=u();++l<v;)(h=y.get(s=p(c=r[l])+""))?h.push(c):y.set(s,[c]);return y.each(function(n,t){o(d,t,a(n,i,u,o))}),d}return s={object:function(n){return a(n,0,r,i)},map:function(n){return a(n,0,u,o)},entries:function(n){return function n(e,r){if(++r>f.length)return e;var i,u=c[r-1];return null!=t&&r>=f.length?i=e.entries():(i=[],e.each(function(t,e){i.push({key:e,values:n(t,r)})})),null!=u?i.sort(function(n,t){return u(n.key,t.key)}):i}(a(n,0,u,o),0)},key:function(n){return f.push(n),s},sortKeys:function(n){return c[f.length-1]=n,s},sortValues:function(t){return n=t,s},rollup:function(n){return t=n,s}}},n.set=c,n.map=e,n.keys=function(n){var t=[];for(var e in n)t.push(e);return t},n.values=function(n){var t=[];for(var e in n)t.push(n[e]);return t},n.entries=function(n){var t=[];for(var e in n)t.push({key:e,value:n[e]});return t},Object.defineProperty(n,"__esModule",{value:!0})});
3
awx/ui/public/static/js/d3-dispatch.v1.min.js
vendored
Normal file
@@ -0,0 +1,3 @@
/* eslint-disable */
// https://d3js.org/d3-dispatch/ v1.0.6 Copyright 2019 Mike Bostock
!function(n,e){"object"==typeof exports&&"undefined"!=typeof module?e(exports):"function"==typeof define&&define.amd?define(["exports"],e):e((n=n||self).d3=n.d3||{})}(this,function(n){"use strict";var e={value:function(){}};function t(){for(var n,e=0,t=arguments.length,o={};e<t;++e){if(!(n=arguments[e]+"")||n in o||/[\s.]/.test(n))throw new Error("illegal type: "+n);o[n]=[]}return new r(o)}function r(n){this._=n}function o(n,e){return n.trim().split(/^|\s+/).map(function(n){var t="",r=n.indexOf(".");if(r>=0&&(t=n.slice(r+1),n=n.slice(0,r)),n&&!e.hasOwnProperty(n))throw new Error("unknown type: "+n);return{type:n,name:t}})}function i(n,e){for(var t,r=0,o=n.length;r<o;++r)if((t=n[r]).name===e)return t.value}function f(n,t,r){for(var o=0,i=n.length;o<i;++o)if(n[o].name===t){n[o]=e,n=n.slice(0,o).concat(n.slice(o+1));break}return null!=r&&n.push({name:t,value:r}),n}r.prototype=t.prototype={constructor:r,on:function(n,e){var t,r=this._,l=o(n+"",r),u=-1,a=l.length;if(!(arguments.length<2)){if(null!=e&&"function"!=typeof e)throw new Error("invalid callback: "+e);for(;++u<a;)if(t=(n=l[u]).type)r[t]=f(r[t],n.name,e);else if(null==e)for(t in r)r[t]=f(r[t],n.name,null);return this}for(;++u<a;)if((t=(n=l[u]).type)&&(t=i(r[t],n.name)))return t},copy:function(){var n={},e=this._;for(var t in e)n[t]=e[t].slice();return new r(n)},call:function(n,e){if((t=arguments.length-2)>0)for(var t,r,o=new Array(t),i=0;i<t;++i)o[i]=arguments[i+2];if(!this._.hasOwnProperty(n))throw new Error("unknown type: "+n);for(i=0,t=(r=this._[n]).length;i<t;++i)r[i].value.apply(e,o)},apply:function(n,e,t){if(!this._.hasOwnProperty(n))throw new Error("unknown type: "+n);for(var r=this._[n],o=0,i=r.length;o<i;++o)r[o].value.apply(e,t)}},n.dispatch=t,Object.defineProperty(n,"__esModule",{value:!0})});
3
awx/ui/public/static/js/d3-force.v1.min.js
vendored
Normal file
File diff suppressed because one or more lines are too long
3
awx/ui/public/static/js/d3-quadtree.v1.min.js
vendored
Normal file
File diff suppressed because one or more lines are too long
3
awx/ui/public/static/js/d3-timer.v1.min.js
vendored
Normal file
@@ -0,0 +1,3 @@
/* eslint-disable */
// https://d3js.org/d3-timer/ v1.0.10 Copyright 2019 Mike Bostock
!function(t,n){"object"==typeof exports&&"undefined"!=typeof module?n(exports):"function"==typeof define&&define.amd?define(["exports"],n):n((t=t||self).d3=t.d3||{})}(this,function(t){"use strict";var n,e,o=0,i=0,r=0,u=1e3,l=0,c=0,f=0,a="object"==typeof performance&&performance.now?performance:Date,s="object"==typeof window&&window.requestAnimationFrame?window.requestAnimationFrame.bind(window):function(t){setTimeout(t,17)};function _(){return c||(s(m),c=a.now()+f)}function m(){c=0}function p(){this._call=this._time=this._next=null}function w(t,n,e){var o=new p;return o.restart(t,n,e),o}function d(){_(),++o;for(var t,e=n;e;)(t=c-e._time)>=0&&e._call.call(null,t),e=e._next;--o}function h(){c=(l=a.now())+f,o=i=0;try{d()}finally{o=0,function(){var t,o,i=n,r=1/0;for(;i;)i._call?(r>i._time&&(r=i._time),t=i,i=i._next):(o=i._next,i._next=null,i=t?t._next=o:n=o);e=t,v(r)}(),c=0}}function y(){var t=a.now(),n=t-l;n>u&&(f-=n,l=t)}function v(t){o||(i&&(i=clearTimeout(i)),t-c>24?(t<1/0&&(i=setTimeout(h,t-a.now()-f)),r&&(r=clearInterval(r))):(r||(l=a.now(),r=setInterval(y,u)),o=1,s(h)))}p.prototype=w.prototype={constructor:p,restart:function(t,o,i){if("function"!=typeof t)throw new TypeError("callback is not a function");i=(null==i?_():+i)+(null==o?0:+o),this._next||e===this||(e?e._next=this:n=this,e=this),this._call=t,this._time=i,v()},stop:function(){this._call&&(this._call=null,this._time=1/0,v())}},t.interval=function(t,n,e){var o=new p,i=n;return null==n?(o.restart(t,n,e),o):(n=+n,e=null==e?_():+e,o.restart(function r(u){u+=i,o.restart(r,i+=n,e),t(u)},n,e),o)},t.now=_,t.timeout=function(t,n,e){var o=new p;return n=null==n?0:+n,o.restart(function(e){o.stop(),t(e+n)},n,e),o},t.timer=w,t.timerFlush=d,Object.defineProperty(t,"__esModule",{value:!0})});
@@ -41,6 +41,7 @@ const Detail = ({
className,
dataCy,
alwaysVisible,
isEmpty,
helpText,
isEncrypted,
isNotConfigured,
@@ -49,6 +50,10 @@ const Detail = ({
return null;
}

if (isEmpty && !alwaysVisible) {
return null;
}

const labelCy = dataCy ? `${dataCy}-label` : null;
const valueCy = dataCy ? `${dataCy}-value` : null;
@@ -163,16 +163,16 @@ function JobListItem({
<Td colSpan={showTypeColumn ? 6 : 5}>
<ExpandableRowContent>
<DetailList>
{job.type === 'inventory_update' &&
inventorySourceLabels.length > 0 && (
<Detail
dataCy="job-inventory-source-type"
label={t`Source`}
value={inventorySourceLabels.map(([string, label]) =>
string === job.source ? label : null
)}
/>
)}
{job.type === 'inventory_update' && (
<Detail
dataCy="job-inventory-source-type"
label={t`Source`}
value={inventorySourceLabels?.map(([string, label]) =>
string === job.source ? label : null
)}
isEmpty={inventorySourceLabels?.length === 0}
/>
)}
<LaunchedByDetail job={job} />
{job.launch_type === 'scheduled' &&
(schedule ? (
@@ -254,7 +254,7 @@ function JobListItem({
dataCy={`execution-environment-detail-${job.id}`}
/>
)}
{credentials && credentials.length > 0 && (
{credentials && (
<Detail
fullWidth
label={t`Credentials`}
@@ -275,6 +275,7 @@ function JobListItem({
))}
</ChipGroup>
}
isEmpty={credentials.length === 0}
/>
)}
{labels && labels.count > 0 && (
@@ -203,6 +203,49 @@ describe('<JobListItem />', () => {
wrapper.find('Detail[label="Execution Environment"] dd').text()
).toBe('Missing resource');
});

test('should not load Source', () => {
wrapper = mountWithContexts(
<table>
<tbody>
<JobListItem
inventorySourceLabels={[]}
job={{
...mockJob,
type: 'inventory_update',
summary_fields: {
user_capabilities: {},
},
}}
/>
</tbody>
</table>
);
const source_detail = wrapper.find(`Detail[label="Source"]`).at(0);
expect(source_detail.prop('isEmpty')).toEqual(true);
});

test('should not load Credentials', () => {
wrapper = mountWithContexts(
<table>
<tbody>
<JobListItem
job={{
...mockJob,
type: 'inventory_update',
summary_fields: {
credentials: [],
},
}}
/>
</tbody>
</table>
);
const credentials_detail = wrapper
.find(`Detail[label="Credentials"]`)
.at(0);
expect(credentials_detail.prop('isEmpty')).toEqual(true);
});
});

describe('<JobListItem with failed job />', () => {
@@ -29,17 +29,11 @@ function PromptInventorySourceDetail({ resource }) {
summary_fields,
update_cache_timeout,
update_on_launch,
update_on_project_update,
verbosity,
} = resource;

let optionsList = '';
if (
overwrite ||
overwrite_vars ||
update_on_launch ||
update_on_project_update
) {
if (overwrite || overwrite_vars || update_on_launch) {
optionsList = (
<TextList component={TextListVariants.ul}>
{overwrite && (
@@ -57,11 +51,6 @@ function PromptInventorySourceDetail({ resource }) {
{t`Update on launch`}
</TextListItem>
)}
{update_on_project_update && (
<TextListItem component={TextListItemVariants.li}>
{t`Update on project update`}
</TextListItem>
)}
</TextList>
);
}
@@ -113,15 +102,14 @@ function PromptInventorySourceDetail({ resource }) {
label={t`Cache Timeout`}
value={`${update_cache_timeout} ${t`Seconds`}`}
/>
{summary_fields?.credentials?.length > 0 && (
<Detail
fullWidth
label={t`Credential`}
value={summary_fields.credentials.map((cred) => (
<CredentialChip key={cred?.id} credential={cred} isReadOnly />
))}
/>
)}
<Detail
fullWidth
label={t`Credential`}
value={summary_fields?.credentials?.map((cred) => (
<CredentialChip key={cred?.id} credential={cred} isReadOnly />
))}
isEmpty={summary_fields?.credentials?.length === 0}
/>
{source_regions && (
<Detail
fullWidth
@@ -67,7 +67,6 @@ describe('PromptInventorySourceDetail', () => {
</li>,
<li>Overwrite local variables from remote inventory source</li>,
<li>Update on launch</li>,
<li>Update on project update</li>,
])
).toEqual(true);
});
@@ -79,4 +78,19 @@ describe('PromptInventorySourceDetail', () => {
);
assertDetail(wrapper, 'Organization', 'Deleted');
});

test('should not load Credentials', () => {
wrapper = mountWithContexts(
<PromptInventorySourceDetail
resource={{
...mockInvSource,
summary_fields: {
credentials: [],
},
}}
/>
);
const credentials_detail = wrapper.find(`Detail[label="Credential"]`).at(0);
expect(credentials_detail.prop('isEmpty')).toEqual(true);
});
});
@@ -26,7 +26,7 @@ function PromptJobTemplateDetail({ resource }) {
|
||||
extra_vars,
|
||||
forks,
|
||||
host_config_key,
|
||||
instance_groups,
|
||||
instance_groups = [],
|
||||
job_slice_count,
|
||||
job_tags,
|
||||
job_type,
|
||||
@@ -94,9 +94,11 @@ function PromptJobTemplateDetail({ resource }) {
|
||||
|
||||
return (
|
||||
<>
|
||||
{summary_fields.recent_jobs?.length > 0 && (
|
||||
<Detail value={<Sparkline jobs={recentJobs} />} label={t`Activity`} />
|
||||
)}
|
||||
<Detail
|
||||
label={t`Activity`}
|
||||
value={<Sparkline jobs={recentJobs} />}
|
||||
isEmpty={summary_fields.recent_jobs?.length === 0}
|
||||
/>
|
||||
<Detail label={t`Job Type`} value={toTitleCase(job_type)} />
|
||||
{summary_fields?.organization ? (
|
||||
<Detail
|
||||
@@ -180,7 +182,7 @@ function PromptJobTemplateDetail({ resource }) {
|
||||
/>
|
||||
)}
|
||||
{optionsList && <Detail label={t`Enabled Options`} value={optionsList} />}
|
||||
{summary_fields?.credentials?.length > 0 && (
|
||||
{summary_fields?.credentials && (
|
||||
<Detail
|
||||
fullWidth
|
||||
label={t`Credentials`}
|
||||
@@ -195,9 +197,10 @@ function PromptJobTemplateDetail({ resource }) {
|
||||
))}
|
||||
</ChipGroup>
|
||||
}
|
||||
isEmpty={summary_fields?.credentials?.length === 0}
|
||||
/>
|
||||
)}
|
||||
{summary_fields?.labels?.results?.length > 0 && (
|
||||
{summary_fields?.labels?.results && (
|
||||
<Detail
|
||||
fullWidth
|
||||
label={t`Labels`}
|
||||
@@ -214,28 +217,28 @@ function PromptJobTemplateDetail({ resource }) {
|
||||
))}
|
||||
</ChipGroup>
|
||||
}
|
||||
isEmpty={summary_fields?.labels?.results?.length === 0}
|
||||
/>
|
||||
)}
|
||||
{instance_groups?.length > 0 && (
|
||||
<Detail
|
||||
fullWidth
|
||||
label={t`Instance Groups`}
|
||||
value={
|
||||
<ChipGroup
|
||||
numChips={5}
|
||||
totalChips={instance_groups.length}
|
||||
ouiaId="prompt-jt-instance-group-chips"
|
||||
>
|
||||
{instance_groups.map((ig) => (
|
||||
<Chip key={ig.id} isReadOnly>
|
||||
{ig.name}
|
||||
</Chip>
|
||||
))}
|
||||
</ChipGroup>
|
||||
}
|
||||
/>
|
||||
)}
|
||||
{job_tags?.length > 0 && (
|
||||
<Detail
|
||||
fullWidth
|
||||
label={t`Instance Groups`}
|
||||
value={
|
||||
<ChipGroup
|
||||
numChips={5}
|
||||
totalChips={instance_groups?.length}
|
||||
ouiaId="prompt-jt-instance-group-chips"
|
||||
>
|
||||
{instance_groups?.map((ig) => (
|
||||
<Chip key={ig.id} isReadOnly>
|
||||
{ig.name}
|
||||
</Chip>
|
||||
))}
|
||||
</ChipGroup>
|
||||
}
|
||||
isEmpty={instance_groups?.length === 0}
|
||||
/>
|
||||
{job_tags && (
|
||||
<Detail
|
||||
fullWidth
|
||||
label={t`Job Tags`}
|
||||
@@ -252,9 +255,10 @@ function PromptJobTemplateDetail({ resource }) {
|
||||
))}
|
||||
</ChipGroup>
|
||||
}
|
||||
isEmpty={job_tags?.length === 0}
|
||||
/>
|
||||
)}
|
||||
{skip_tags?.length > 0 && (
|
||||
{skip_tags && (
|
||||
<Detail
|
||||
fullWidth
|
||||
label={t`Skip Tags`}
|
||||
@@ -271,6 +275,7 @@ function PromptJobTemplateDetail({ resource }) {
|
||||
))}
|
||||
</ChipGroup>
|
||||
}
|
||||
isEmpty={skip_tags?.length === 0}
|
||||
/>
|
||||
)}
|
||||
{extra_vars && (
|
||||
|
||||
@@ -125,4 +125,92 @@ describe('PromptJobTemplateDetail', () => {
|
||||
assertDetail(wrapper, 'Organization', 'Deleted');
|
||||
assertDetail(wrapper, 'Project', 'Deleted');
|
||||
});
|
||||
|
||||
test('should not load Activity', () => {
|
||||
wrapper = mountWithContexts(
|
||||
<PromptJobTemplateDetail
|
||||
resource={{
|
||||
...mockJT,
|
||||
summary_fields: {
|
||||
recent_jobs: [],
|
||||
},
|
||||
}}
|
||||
/>
|
||||
);
|
||||
const activity_detail = wrapper.find(`Detail[label="Activity"]`).at(0);
|
||||
expect(activity_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
|
||||
test('should not load Credentials', () => {
|
||||
wrapper = mountWithContexts(
|
||||
<PromptJobTemplateDetail
|
||||
resource={{
|
||||
...mockJT,
|
||||
summary_fields: {
|
||||
credentials: [],
|
||||
},
|
||||
}}
|
||||
/>
|
||||
);
|
||||
const credentials_detail = wrapper
|
||||
.find(`Detail[label="Credentials"]`)
|
||||
.at(0);
|
||||
expect(credentials_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
|
||||
test('should not load Labels', () => {
|
||||
wrapper = mountWithContexts(
|
||||
<PromptJobTemplateDetail
|
||||
resource={{
|
||||
...mockJT,
|
||||
summary_fields: {
|
||||
labels: {
|
||||
results: [],
|
||||
},
|
||||
},
|
||||
}}
|
||||
/>
|
||||
);
|
||||
const labels_detail = wrapper.find(`Detail[label="Labels"]`).at(0);
|
||||
expect(labels_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
|
||||
test('should not load Instance Groups', () => {
|
||||
wrapper = mountWithContexts(
|
||||
<PromptJobTemplateDetail
|
||||
resource={{
|
||||
...mockJT,
|
||||
instance_groups: [],
|
||||
}}
|
||||
/>
|
||||
);
|
||||
const instance_groups_detail = wrapper
|
||||
.find(`Detail[label="Instance Groups"]`)
|
||||
.at(0);
|
||||
expect(instance_groups_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
|
||||
test('should not load Job Tags', () => {
|
||||
wrapper = mountWithContexts(
|
||||
<PromptJobTemplateDetail
|
||||
resource={{
|
||||
...mockJT,
|
||||
job_tags: '',
|
||||
}}
|
||||
/>
|
||||
);
|
||||
expect(wrapper.find('Detail[label="Job Tags"]').length).toBe(0);
|
||||
});
|
||||
|
||||
test('should not load Skip Tags', () => {
|
||||
wrapper = mountWithContexts(
|
||||
<PromptJobTemplateDetail
|
||||
resource={{
|
||||
...mockJT,
|
||||
skip_tags: '',
|
||||
}}
|
||||
/>
|
||||
);
|
||||
expect(wrapper.find('Detail[label="Skip Tags"]').length).toBe(0);
|
||||
});
|
||||
});
|
||||
|
||||
@@ -57,9 +57,11 @@ function PromptWFJobTemplateDetail({ resource }) {
|
||||
|
||||
return (
|
||||
<>
|
||||
{summary_fields?.recent_jobs?.length > 0 && (
|
||||
<Detail value={<Sparkline jobs={recentJobs} />} label={t`Activity`} />
|
||||
)}
|
||||
<Detail
|
||||
label={t`Activity`}
|
||||
value={<Sparkline jobs={recentJobs} />}
|
||||
isEmpty={summary_fields?.recent_jobs?.length === 0}
|
||||
/>
|
||||
{summary_fields?.organization && (
|
||||
<Detail
|
||||
label={t`Organization`}
|
||||
@@ -108,7 +110,7 @@ function PromptWFJobTemplateDetail({ resource }) {
|
||||
}
|
||||
/>
|
||||
)}
|
||||
{summary_fields?.labels?.results?.length > 0 && (
|
||||
{summary_fields?.labels?.results && (
|
||||
<Detail
|
||||
fullWidth
|
||||
label={t`Labels`}
|
||||
@@ -125,6 +127,7 @@ function PromptWFJobTemplateDetail({ resource }) {
|
||||
))}
|
||||
</ChipGroup>
|
||||
}
|
||||
isEmpty={summary_fields?.labels?.results?.length === 0}
|
||||
/>
|
||||
)}
|
||||
{extra_vars && (
|
||||
|
||||
@@ -62,4 +62,36 @@ describe('PromptWFJobTemplateDetail', () => {
|
||||
'---\nmock: data'
|
||||
);
|
||||
});
|
||||
|
||||
test('should not load Activity', () => {
|
||||
wrapper = mountWithContexts(
|
||||
<PromptWFJobTemplateDetail
|
||||
resource={{
|
||||
...mockWF,
|
||||
summary_fields: {
|
||||
recent_jobs: [],
|
||||
},
|
||||
}}
|
||||
/>
|
||||
);
|
||||
const activity_detail = wrapper.find(`Detail[label="Activity"]`).at(0);
|
||||
expect(activity_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
|
||||
test('should not load Labels', () => {
|
||||
wrapper = mountWithContexts(
|
||||
<PromptWFJobTemplateDetail
|
||||
resource={{
|
||||
...mockWF,
|
||||
summary_fields: {
|
||||
labels: {
|
||||
results: [],
|
||||
},
|
||||
},
|
||||
}}
|
||||
/>
|
||||
);
|
||||
const labels_detail = wrapper.find(`Detail[label="Labels"]`).at(0);
|
||||
expect(labels_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
});
|
||||
|
||||
@@ -112,7 +112,6 @@
|
||||
"update_on_launch":true,
|
||||
"update_cache_timeout":2,
|
||||
"source_project":8,
|
||||
"update_on_project_update":true,
|
||||
"last_update_failed": true,
|
||||
"last_updated":null
|
||||
}
|
||||
@@ -68,34 +68,32 @@ function ResourceAccessListItem({ accessRecord, onRoleDelete }) {
|
||||
<Td dataLabel={t`Last name`}>{accessRecord.last_name}</Td>
|
||||
<Td dataLabel={t`Roles`}>
|
||||
<DetailList stacked>
|
||||
{userRoles.length > 0 && (
|
||||
<Detail
|
||||
label={t`User Roles`}
|
||||
value={
|
||||
<ChipGroup
|
||||
numChips={5}
|
||||
totalChips={userRoles.length}
|
||||
ouiaId="user-role-chips"
|
||||
>
|
||||
{userRoles.map(renderChip)}
|
||||
</ChipGroup>
|
||||
}
|
||||
/>
|
||||
)}
|
||||
{teamRoles.length > 0 && (
|
||||
<Detail
|
||||
label={t`Team Roles`}
|
||||
value={
|
||||
<ChipGroup
|
||||
numChips={5}
|
||||
totalChips={teamRoles.length}
|
||||
ouiaId="team-role-chips"
|
||||
>
|
||||
{teamRoles.map(renderChip)}
|
||||
</ChipGroup>
|
||||
}
|
||||
/>
|
||||
)}
|
||||
<Detail
|
||||
label={t`User Roles`}
|
||||
value={
|
||||
<ChipGroup
|
||||
numChips={5}
|
||||
totalChips={userRoles.length}
|
||||
ouiaId="user-role-chips"
|
||||
>
|
||||
{userRoles.map(renderChip)}
|
||||
</ChipGroup>
|
||||
}
|
||||
isEmpty={userRoles.length === 0}
|
||||
/>
|
||||
<Detail
|
||||
label={t`Team Roles`}
|
||||
value={
|
||||
<ChipGroup
|
||||
numChips={5}
|
||||
totalChips={teamRoles.length}
|
||||
ouiaId="team-role-chips"
|
||||
>
|
||||
{teamRoles.map(renderChip)}
|
||||
</ChipGroup>
|
||||
}
|
||||
isEmpty={teamRoles.length === 0}
|
||||
/>
|
||||
</DetailList>
|
||||
</Td>
|
||||
</Tr>
|
||||
|
||||
@@ -53,5 +53,41 @@ describe('<ResourceAccessListItem />', () => {
|
||||
|
||||
expect(wrapper.find('Td[dataLabel="First name"]').text()).toBe('jane');
|
||||
expect(wrapper.find('Td[dataLabel="Last name"]').text()).toBe('brown');
|
||||
|
||||
const user_roles_detail = wrapper.find(`Detail[label="User Roles"]`).at(0);
|
||||
expect(user_roles_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
|
||||
test('should not load team roles', async () => {
|
||||
let wrapper;
|
||||
|
||||
await act(async () => {
|
||||
wrapper = mountWithContexts(
|
||||
<table>
|
||||
<tbody>
|
||||
<ResourceAccessListItem
|
||||
accessRecord={{
|
||||
...accessRecord,
|
||||
summary_fields: {
|
||||
direct_access: [
|
||||
{
|
||||
role: {
|
||||
id: 3,
|
||||
name: 'Member',
|
||||
user_capabilities: { unattach: true },
|
||||
},
|
||||
},
|
||||
],
|
||||
indirect_access: [],
|
||||
},
|
||||
}}
|
||||
onRoleDelete={() => {}}
|
||||
/>
|
||||
</tbody>
|
||||
</table>
|
||||
);
|
||||
});
|
||||
const team_roles_detail = wrapper.find(`Detail[label="Team Roles"]`).at(0);
|
||||
expect(team_roles_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
});
|
||||
|
||||
@@ -272,11 +272,12 @@ function TemplateListItem({
|
||||
value={template.description}
|
||||
dataCy={`template-${template.id}-description`}
|
||||
/>
|
||||
{summaryFields.recent_jobs && summaryFields.recent_jobs.length ? (
|
||||
{summaryFields.recent_jobs ? (
|
||||
<Detail
|
||||
label={t`Activity`}
|
||||
value={<Sparkline jobs={summaryFields.recent_jobs} />}
|
||||
dataCy={`template-${template.id}-activity`}
|
||||
isEmpty={summaryFields.recent_jobs.length === 0}
|
||||
/>
|
||||
) : null}
|
||||
{summaryFields.inventory ? (
|
||||
@@ -316,7 +317,7 @@ function TemplateListItem({
|
||||
value={formatDateString(template.modified)}
|
||||
dataCy={`template-${template.id}-last-modified`}
|
||||
/>
|
||||
{summaryFields.credentials && summaryFields.credentials.length ? (
|
||||
{summaryFields.credentials ? (
|
||||
<Detail
|
||||
fullWidth
|
||||
label={t`Credentials`}
|
||||
@@ -337,9 +338,10 @@ function TemplateListItem({
|
||||
</ChipGroup>
|
||||
}
|
||||
dataCy={`template-${template.id}-credentials`}
|
||||
isEmpty={summaryFields.credentials.length === 0}
|
||||
/>
|
||||
) : null}
|
||||
{summaryFields.labels && summaryFields.labels.results.length > 0 && (
|
||||
{summaryFields.labels && (
|
||||
<Detail
|
||||
fullWidth
|
||||
label={t`Labels`}
|
||||
@@ -361,6 +363,7 @@ function TemplateListItem({
|
||||
</ChipGroup>
|
||||
}
|
||||
dataCy={`template-${template.id}-labels`}
|
||||
isEmpty={summaryFields.labels.results.length === 0}
|
||||
/>
|
||||
)}
|
||||
</DetailList>
|
||||
|
||||
@@ -465,4 +465,68 @@ describe('<TemplateListItem />', () => {
|
||||
).toEqual(true);
|
||||
expect(wrapper.find(`Detail[label="Activity"] Sparkline`)).toHaveLength(1);
|
||||
});
|
||||
|
||||
test('should not load Activity', async () => {
|
||||
const wrapper = mountWithContexts(
|
||||
<table>
|
||||
<tbody>
|
||||
<TemplateListItem
|
||||
template={{
|
||||
...mockJobTemplateData,
|
||||
summary_fields: {
|
||||
user_capabilities: {},
|
||||
recent_jobs: [],
|
||||
},
|
||||
}}
|
||||
/>
|
||||
</tbody>
|
||||
</table>
|
||||
);
|
||||
const activity_detail = wrapper.find(`Detail[label="Activity"]`).at(0);
|
||||
expect(activity_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
|
||||
test('should not load Credentials', async () => {
|
||||
const wrapper = mountWithContexts(
|
||||
<table>
|
||||
<tbody>
|
||||
<TemplateListItem
|
||||
template={{
|
||||
...mockJobTemplateData,
|
||||
summary_fields: {
|
||||
user_capabilities: {},
|
||||
credentials: [],
|
||||
},
|
||||
}}
|
||||
/>
|
||||
</tbody>
|
||||
</table>
|
||||
);
|
||||
const credentials_detail = wrapper
|
||||
.find(`Detail[label="Credentials"]`)
|
||||
.at(0);
|
||||
expect(credentials_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
|
||||
test('should not load Labels', async () => {
|
||||
const wrapper = mountWithContexts(
|
||||
<table>
|
||||
<tbody>
|
||||
<TemplateListItem
|
||||
template={{
|
||||
...mockJobTemplateData,
|
||||
summary_fields: {
|
||||
user_capabilities: {},
|
||||
labels: {
|
||||
results: [],
|
||||
},
|
||||
},
|
||||
}}
|
||||
/>
|
||||
</tbody>
|
||||
</table>
|
||||
);
|
||||
const labels_detail = wrapper.find(`Detail[label="Labels"]`).at(0);
|
||||
expect(labels_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
});
|
||||
|
||||
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
@@ -264,20 +264,19 @@ function CredentialDetail({ credential }) {
|
||||
date={modified}
|
||||
user={modified_by}
|
||||
/>
|
||||
{enabledBooleanFields.length > 0 && (
|
||||
<Detail
|
||||
label={t`Enabled Options`}
|
||||
value={
|
||||
<TextList component={TextListVariants.ul}>
|
||||
{enabledBooleanFields.map(({ id, label }) => (
|
||||
<TextListItem key={id} component={TextListItemVariants.li}>
|
||||
{label}
|
||||
</TextListItem>
|
||||
))}
|
||||
</TextList>
|
||||
}
|
||||
/>
|
||||
)}
|
||||
<Detail
|
||||
label={t`Enabled Options`}
|
||||
value={
|
||||
<TextList component={TextListVariants.ul}>
|
||||
{enabledBooleanFields.map(({ id, label }) => (
|
||||
<TextListItem key={id} component={TextListItemVariants.li}>
|
||||
{label}
|
||||
</TextListItem>
|
||||
))}
|
||||
</TextList>
|
||||
}
|
||||
isEmpty={enabledBooleanFields.length === 0}
|
||||
/>
|
||||
</DetailList>
|
||||
{Object.keys(inputSources).length > 0 && (
|
||||
<PluginFieldText>
|
||||
|
||||
@@ -149,4 +149,23 @@ describe('<CredentialDetail />', () => {
|
||||
wrapper.find('ModalBoxCloseButton').invoke('onClose')();
|
||||
});
|
||||
});
|
||||
|
||||
test('should not load enabled options', async () => {
|
||||
await act(async () => {
|
||||
wrapper = mountWithContexts(
|
||||
<CredentialDetail
|
||||
credential={{
|
||||
...mockCredential,
|
||||
results: {
|
||||
inputs: null,
|
||||
},
|
||||
}}
|
||||
/>
|
||||
);
|
||||
});
|
||||
const enabled_options_detail = wrapper
|
||||
.find(`Detail[label="Enabled Options"]`)
|
||||
.at(0);
|
||||
expect(enabled_options_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
});
|
||||
|
||||
@@ -67,9 +67,11 @@ function HostDetail({ host }) {
|
||||
<HostToggle host={host} css="padding-bottom: 40px" />
|
||||
<DetailList gutter="sm">
|
||||
<Detail label={t`Name`} value={name} dataCy="host-name" />
|
||||
{recentJobs?.length > 0 && (
|
||||
<Detail label={t`Activity`} value={<Sparkline jobs={recentJobs} />} />
|
||||
)}
|
||||
<Detail
|
||||
label={t`Activity`}
|
||||
value={<Sparkline jobs={recentJobs} />}
|
||||
isEmpty={recentJobs?.length === 0}
|
||||
/>
|
||||
<Detail label={t`Description`} value={description} />
|
||||
<Detail
|
||||
label={t`Inventory`}
|
||||
|
||||
@@ -81,6 +81,8 @@ describe('<HostDetail />', () => {
|
||||
expect(wrapper.find(`Detail[label="Activity"] Sparkline`)).toHaveLength(
|
||||
0
|
||||
);
|
||||
const activity_detail = wrapper.find(`Detail[label="Activity"]`).at(0);
|
||||
expect(activity_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
|
||||
test('should hide edit button for users without edit permission', async () => {
|
||||
|
||||
@@ -79,17 +79,17 @@ function InventoryDetail({ inventory }) {
|
||||
}
|
||||
/>
|
||||
<Detail label={t`Total hosts`} value={inventory.total_hosts} />
|
||||
{instanceGroups && instanceGroups.length > 0 && (
|
||||
{instanceGroups && (
|
||||
<Detail
|
||||
fullWidth
|
||||
label={t`Instance Groups`}
|
||||
value={
|
||||
<ChipGroup
|
||||
numChips={5}
|
||||
totalChips={instanceGroups.length}
|
||||
totalChips={instanceGroups?.length}
|
||||
ouiaId="instance-group-chips"
|
||||
>
|
||||
{instanceGroups.map((ig) => (
|
||||
{instanceGroups?.map((ig) => (
|
||||
<Chip
|
||||
key={ig.id}
|
||||
isReadOnly
|
||||
@@ -100,28 +100,29 @@ function InventoryDetail({ inventory }) {
|
||||
))}
|
||||
</ChipGroup>
|
||||
}
|
||||
isEmpty={instanceGroups.length === 0}
|
||||
/>
|
||||
)}
|
||||
{inventory.summary_fields.labels && (
|
||||
<Detail
|
||||
fullWidth
|
||||
helpText={helpText.labels}
|
||||
label={t`Labels`}
|
||||
value={
|
||||
<ChipGroup
|
||||
numChips={5}
|
||||
totalChips={inventory.summary_fields.labels?.results?.length}
|
||||
>
|
||||
{inventory.summary_fields.labels?.results?.map((l) => (
|
||||
<Chip key={l.id} isReadOnly>
|
||||
{l.name}
|
||||
</Chip>
|
||||
))}
|
||||
</ChipGroup>
|
||||
}
|
||||
isEmpty={inventory.summary_fields.labels?.results?.length === 0}
|
||||
/>
|
||||
)}
|
||||
{inventory.summary_fields.labels &&
|
||||
inventory.summary_fields.labels?.results?.length > 0 && (
|
||||
<Detail
|
||||
fullWidth
|
||||
helpText={helpText.labels}
|
||||
label={t`Labels`}
|
||||
value={
|
||||
<ChipGroup
|
||||
numChips={5}
|
||||
totalChips={inventory.summary_fields.labels.results.length}
|
||||
>
|
||||
{inventory.summary_fields.labels.results.map((l) => (
|
||||
<Chip key={l.id} isReadOnly>
|
||||
{l.name}
|
||||
</Chip>
|
||||
))}
|
||||
</ChipGroup>
|
||||
}
|
||||
/>
|
||||
)}
|
||||
<VariablesDetail
|
||||
label={t`Variables`}
|
||||
helpText={helpText.variables()}
|
||||
|
||||
@@ -153,6 +153,9 @@ describe('<InventoryDetail />', () => {
|
||||
expect(InventoriesAPI.readInstanceGroups).toHaveBeenCalledWith(
|
||||
mockInventory.id
|
||||
);
|
||||
expect(wrapper.find(`Detail[label="Instance Groups"]`)).toHaveLength(0);
|
||||
const instance_groups_detail = wrapper
|
||||
.find(`Detail[label="Instance Groups"]`)
|
||||
.at(0);
|
||||
expect(instance_groups_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
});
|
||||
|
||||
@@ -72,12 +72,11 @@ function InventoryHostDetail({ host }) {
|
||||
<HostToggle host={host} css="padding-bottom: 40px" />
|
||||
<DetailList gutter="sm">
|
||||
<Detail label={t`Name`} value={name} />
|
||||
{recentPlaybookJobs?.length > 0 && (
|
||||
<Detail
|
||||
label={t`Activity`}
|
||||
value={<Sparkline jobs={recentPlaybookJobs} />}
|
||||
/>
|
||||
)}
|
||||
<Detail
|
||||
label={t`Activity`}
|
||||
value={<Sparkline jobs={recentPlaybookJobs} />}
|
||||
isEmpty={recentPlaybookJobs?.length === 0}
|
||||
/>
|
||||
<Detail label={t`Description`} value={description} />
|
||||
<UserDateDetail date={created} label={t`Created`} user={created_by} />
|
||||
<UserDateDetail
|
||||
|
||||
@@ -91,6 +91,8 @@ describe('<InventoryHostDetail />', () => {
|
||||
expect(wrapper.find(`Detail[label="Activity"] Sparkline`)).toHaveLength(
|
||||
0
|
||||
);
|
||||
const activity_detail = wrapper.find(`Detail[label="Activity"]`).at(0);
|
||||
expect(activity_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
|
||||
test('should hide edit button for users without edit permission', async () => {
|
||||
|
||||
@@ -216,7 +216,7 @@ function InventoryList() {
|
||||
headerRow={
|
||||
<HeaderRow qsConfig={QS_CONFIG}>
|
||||
<HeaderCell sortKey="name">{t`Name`}</HeaderCell>
|
||||
<HeaderCell>{t`Status`}</HeaderCell>
|
||||
<HeaderCell>{t`Sync Status`}</HeaderCell>
|
||||
<HeaderCell>{t`Type`}</HeaderCell>
|
||||
<HeaderCell>{t`Organization`}</HeaderCell>
|
||||
<HeaderCell>{t`Actions`}</HeaderCell>
|
||||
|
||||
@@ -31,7 +31,6 @@ describe('<InventorySourceAdd />', () => {
|
||||
source_vars: '---↵',
|
||||
update_cache_timeout: 0,
|
||||
update_on_launch: false,
|
||||
update_on_project_update: false,
|
||||
verbosity: 1,
|
||||
};
|
||||
|
||||
|
||||
@@ -49,7 +49,6 @@ function InventorySourceDetail({ inventorySource }) {
|
||||
source_vars,
|
||||
update_cache_timeout,
|
||||
update_on_launch,
|
||||
update_on_project_update,
|
||||
verbosity,
|
||||
enabled_var,
|
||||
enabled_value,
|
||||
@@ -113,12 +112,7 @@ function InventorySourceDetail({ inventorySource }) {
|
||||
);
|
||||
|
||||
let optionsList = '';
|
||||
if (
|
||||
overwrite ||
|
||||
overwrite_vars ||
|
||||
update_on_launch ||
|
||||
update_on_project_update
|
||||
) {
|
||||
if (overwrite || overwrite_vars || update_on_launch) {
|
||||
optionsList = (
|
||||
<TextList component={TextListVariants.ul}>
|
||||
{overwrite && (
|
||||
@@ -143,16 +137,6 @@ function InventorySourceDetail({ inventorySource }) {
|
||||
/>
|
||||
</TextListItem>
|
||||
)}
|
||||
{update_on_project_update && (
|
||||
<TextListItem component={TextListItemVariants.li}>
|
||||
{t`Update on project update`}
|
||||
<Popover
|
||||
content={helpText.subFormOptions.updateOnProjectUpdate({
|
||||
value: source_project,
|
||||
})}
|
||||
/>
|
||||
</TextListItem>
|
||||
)}
|
||||
</TextList>
|
||||
);
|
||||
}
|
||||
@@ -268,15 +252,14 @@ function InventorySourceDetail({ inventorySource }) {
|
||||
helpText={helpText.enabledValue}
|
||||
value={enabled_value}
|
||||
/>
|
||||
{credentials?.length > 0 && (
|
||||
<Detail
|
||||
fullWidth
|
||||
label={t`Credential`}
|
||||
value={credentials.map((cred) => (
|
||||
<CredentialChip key={cred?.id} credential={cred} isReadOnly />
|
||||
))}
|
||||
/>
|
||||
)}
|
||||
<Detail
|
||||
fullWidth
|
||||
label={t`Credential`}
|
||||
value={credentials?.map((cred) => (
|
||||
<CredentialChip key={cred?.id} credential={cred} isReadOnly />
|
||||
))}
|
||||
isEmpty={credentials?.length === 0}
|
||||
/>
|
||||
{optionsList && (
|
||||
<Detail fullWidth label={t`Enabled Options`} value={optionsList} />
|
||||
)}
|
||||
|
||||
@@ -113,7 +113,6 @@ describe('InventorySourceDetail', () => {
|
||||
'Overwrite local groups and hosts from remote inventory source',
|
||||
'Overwrite local variables from remote inventory source',
|
||||
'Update on launch',
|
||||
'Update on project update',
|
||||
]).toContain(option.text());
|
||||
});
|
||||
});
|
||||
@@ -237,4 +236,21 @@ describe('InventorySourceDetail', () => {
|
||||
(el) => el.length === 0
|
||||
);
|
||||
});
|
||||
|
||||
test('should not load Credentials', async () => {
|
||||
await act(async () => {
|
||||
wrapper = mountWithContexts(
|
||||
<InventorySourceDetail
|
||||
inventorySource={{
|
||||
...mockInvSource,
|
||||
summary_fields: {
|
||||
credentials: [],
|
||||
},
|
||||
}}
|
||||
/>
|
||||
);
|
||||
});
|
||||
const credentials_detail = wrapper.find(`Detail[label="Credential"]`).at(0);
|
||||
expect(credentials_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
});
|
||||
|
||||
@@ -31,7 +31,6 @@ describe('<InventorySourceEdit />', () => {
|
||||
source_vars: '---↵',
|
||||
update_cache_timeout: 0,
|
||||
update_on_launch: false,
|
||||
update_on_project_update: false,
|
||||
verbosity: 1,
|
||||
};
|
||||
const mockInventory = {
|
||||
|
||||
@@ -96,12 +96,11 @@ function SmartInventoryDetail({ inventory }) {
|
||||
<CardBody>
|
||||
<DetailList>
|
||||
<Detail label={t`Name`} value={name} />
|
||||
{recentJobs.length > 0 && (
|
||||
<Detail
|
||||
label={t`Activity`}
|
||||
value={<Sparkline jobs={recentJobs} />}
|
||||
/>
|
||||
)}
|
||||
<Detail
|
||||
label={t`Activity`}
|
||||
value={<Sparkline jobs={recentJobs} />}
|
||||
isEmpty={recentJobs.length === 0}
|
||||
/>
|
||||
<Detail label={t`Description`} value={description} />
|
||||
<Detail label={t`Type`} value={t`Smart inventory`} />
|
||||
<Detail
|
||||
@@ -118,29 +117,28 @@ function SmartInventoryDetail({ inventory }) {
|
||||
value={<Label variant="outline">{host_filter}</Label>}
|
||||
/>
|
||||
<Detail label={t`Total hosts`} value={total_hosts} />
|
||||
{instanceGroups.length > 0 && (
|
||||
<Detail
|
||||
fullWidth
|
||||
label={t`Instance groups`}
|
||||
value={
|
||||
<ChipGroup
|
||||
numChips={5}
|
||||
totalChips={instanceGroups.length}
|
||||
ouiaId="instance-group-chips"
|
||||
>
|
||||
{instanceGroups.map((ig) => (
|
||||
<Chip
|
||||
key={ig.id}
|
||||
isReadOnly
|
||||
ouiaId={`instance-group-${ig.id}-chip`}
|
||||
>
|
||||
{ig.name}
|
||||
</Chip>
|
||||
))}
|
||||
</ChipGroup>
|
||||
}
|
||||
/>
|
||||
)}
|
||||
<Detail
|
||||
fullWidth
|
||||
label={t`Instance groups`}
|
||||
value={
|
||||
<ChipGroup
|
||||
numChips={5}
|
||||
totalChips={instanceGroups.length}
|
||||
ouiaId="instance-group-chips"
|
||||
>
|
||||
{instanceGroups.map((ig) => (
|
||||
<Chip
|
||||
key={ig.id}
|
||||
isReadOnly
|
||||
ouiaId={`instance-group-${ig.id}-chip`}
|
||||
>
|
||||
{ig.name}
|
||||
</Chip>
|
||||
))}
|
||||
</ChipGroup>
|
||||
}
|
||||
isEmpty={instanceGroups.length === 0}
|
||||
/>
|
||||
<VariablesDetail
|
||||
label={t`Variables`}
|
||||
value={variables}
|
||||
|
||||
@@ -112,6 +112,41 @@ describe('<SmartInventoryDetail />', () => {
|
||||
(el) => el.length === 0
|
||||
);
|
||||
});
|
||||
|
||||
test('should not load Activity', async () => {
|
||||
await act(async () => {
|
||||
wrapper = mountWithContexts(
|
||||
<SmartInventoryDetail
|
||||
inventory={{
|
||||
...mockSmartInventory,
|
||||
recent_jobs: [],
|
||||
}}
|
||||
/>
|
||||
);
|
||||
});
|
||||
const activity_detail = wrapper.find(`Detail[label="Activity"]`).at(0);
|
||||
expect(activity_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
|
||||
test('should not load Instance Groups', async () => {
|
||||
InventoriesAPI.readInstanceGroups.mockResolvedValue({
|
||||
data: {
|
||||
results: [],
|
||||
},
|
||||
});
|
||||
|
||||
let wrapper;
|
||||
await act(async () => {
|
||||
wrapper = mountWithContexts(
|
||||
<SmartInventoryDetail inventory={mockSmartInventory} />
|
||||
);
|
||||
});
|
||||
wrapper.update();
|
||||
const instance_groups_detail = wrapper
|
||||
.find(`Detail[label="Instance groups"]`)
|
||||
.at(0);
|
||||
expect(instance_groups_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
});
|
||||
|
||||
describe('User has read-only permissions', () => {
|
||||
|
||||
@@ -28,12 +28,11 @@ function SmartInventoryHostDetail({ host }) {
|
||||
<CardBody>
|
||||
<DetailList gutter="sm">
|
||||
<Detail label={t`Name`} value={name} />
|
||||
{recentPlaybookJobs?.length > 0 && (
|
||||
<Detail
|
||||
label={t`Activity`}
|
||||
value={<Sparkline jobs={recentPlaybookJobs} />}
|
||||
/>
|
||||
)}
|
||||
<Detail
|
||||
label={t`Activity`}
|
||||
value={<Sparkline jobs={recentPlaybookJobs} />}
|
||||
isEmpty={recentPlaybookJobs?.length === 0}
|
||||
/>
|
||||
<Detail label={t`Description`} value={description} />
|
||||
<Detail
|
||||
label={t`Inventory`}
|
||||
|
||||
@@ -27,4 +27,19 @@ describe('<SmartInventoryHostDetail />', () => {
|
||||
expect(wrapper.find('Detail[label="Activity"] Sparkline')).toHaveLength(1);
|
||||
expect(wrapper.find('VariablesDetail')).toHaveLength(1);
|
||||
});
|
||||
|
||||
test('should not load Activity', () => {
|
||||
wrapper = mountWithContexts(
|
||||
<SmartInventoryHostDetail
|
||||
host={{
|
||||
...mockHost,
|
||||
summary_fields: {
|
||||
recent_jobs: [],
|
||||
},
|
||||
}}
|
||||
/>
|
||||
);
|
||||
const activity_detail = wrapper.find(`Detail[label="Activity"]`).at(0);
|
||||
expect(activity_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
});
|
||||
|
||||
@@ -73,7 +73,6 @@ const InventorySourceFormFields = ({
|
||||
source_vars: '---\n',
|
||||
update_cache_timeout: 0,
|
||||
update_on_launch: false,
|
||||
update_on_project_update: false,
|
||||
verbosity: 1,
|
||||
enabled_var: '',
|
||||
enabled_value: '',
|
||||
@@ -251,7 +250,6 @@ const InventorySourceForm = ({
|
||||
source_vars: source?.source_vars || '---\n',
|
||||
update_cache_timeout: source?.update_cache_timeout || 0,
|
||||
update_on_launch: source?.update_on_launch || false,
|
||||
update_on_project_update: source?.update_on_project_update || false,
|
||||
verbosity: source?.verbosity || 1,
|
||||
enabled_var: source?.enabled_var || '',
|
||||
enabled_value: source?.enabled_value || '',
|
||||
|
||||
@@ -17,7 +17,6 @@ const initialValues = {
|
||||
source_vars: '---\n',
|
||||
update_cache_timeout: 0,
|
||||
update_on_launch: true,
|
||||
update_on_project_update: false,
|
||||
verbosity: 1,
|
||||
};
|
||||
|
||||
|
||||
@@ -17,7 +17,6 @@ const initialValues = {
|
||||
source_vars: '---\n',
|
||||
update_cache_timeout: 0,
|
||||
update_on_launch: true,
|
||||
update_on_project_update: false,
|
||||
verbosity: 1,
|
||||
};
|
||||
|
||||
|
||||
@@ -17,7 +17,6 @@ const initialValues = {
|
||||
source_vars: '---\n',
|
||||
update_cache_timeout: 0,
|
||||
update_on_launch: true,
|
||||
update_on_project_update: false,
|
||||
verbosity: 1,
|
||||
};
|
||||
|
||||
|
||||
@@ -17,7 +17,6 @@ const initialValues = {
|
||||
source_vars: '---\n',
|
||||
update_cache_timeout: 0,
|
||||
update_on_launch: true,
|
||||
update_on_project_update: false,
|
||||
verbosity: 1,
|
||||
};
|
||||
|
||||
|
||||
@@ -17,7 +17,6 @@ const initialValues = {
|
||||
source_vars: '---\n',
|
||||
update_cache_timeout: 0,
|
||||
update_on_launch: true,
|
||||
update_on_project_update: false,
|
||||
verbosity: 1,
|
||||
};
|
||||
|
||||
|
||||
@@ -17,7 +17,6 @@ const initialValues = {
|
||||
source_vars: '---\n',
|
||||
update_cache_timeout: 0,
|
||||
update_on_launch: true,
|
||||
update_on_project_update: false,
|
||||
verbosity: 1,
|
||||
};
|
||||
|
||||
|
||||
@@ -17,7 +17,6 @@ const initialValues = {
|
||||
source_vars: '---\n',
|
||||
update_cache_timeout: 0,
|
||||
update_on_launch: true,
|
||||
update_on_project_update: false,
|
||||
verbosity: 1,
|
||||
};
|
||||
|
||||
@@ -121,7 +120,6 @@ describe('<SCMSubForm />', () => {
|
||||
source_vars: '---\n',
|
||||
update_cache_timeout: 0,
|
||||
update_on_launch: true,
|
||||
update_on_project_update: false,
|
||||
verbosity: 1,
|
||||
};
|
||||
let customWrapper;
|
||||
@@ -139,63 +137,4 @@ describe('<SCMSubForm />', () => {
|
||||
customWrapper.update();
|
||||
expect(customWrapper.find('Select').prop('selections')).toBe('newPath');
|
||||
});
|
||||
test('Update on project update should be disabled', async () => {
|
||||
const customInitialValues = {
|
||||
credential: { id: 1, name: 'Credential' },
|
||||
custom_virtualenv: '',
|
||||
overwrite: false,
|
||||
overwrite_vars: false,
|
||||
source_path: '/path',
|
||||
source_project: { id: 1, name: 'Source project' },
|
||||
source_script: null,
|
||||
source_vars: '---\n',
|
||||
update_cache_timeout: 0,
|
||||
update_on_launch: true,
|
||||
update_on_project_update: false,
|
||||
verbosity: 1,
|
||||
};
|
||||
let customWrapper;
|
||||
await act(async () => {
|
||||
customWrapper = mountWithContexts(
|
||||
<Formik initialValues={customInitialValues}>
|
||||
<SCMSubForm />
|
||||
</Formik>
|
||||
);
|
||||
});
|
||||
expect(
|
||||
customWrapper
|
||||
.find('Checkbox[aria-label="Update on project update"]')
|
||||
.prop('isDisabled')
|
||||
).toBe(true);
|
||||
});
|
||||
|
||||
test('Update on launch should be disabled', async () => {
|
||||
const customInitialValues = {
|
||||
credential: { id: 1, name: 'Credential' },
|
||||
custom_virtualenv: '',
|
||||
overwrite: false,
|
||||
overwrite_vars: false,
|
||||
source_path: '/path',
|
||||
source_project: { id: 1, name: 'Source project' },
|
||||
source_script: null,
|
||||
source_vars: '---\n',
|
||||
update_cache_timeout: 0,
|
||||
update_on_launch: false,
|
||||
update_on_project_update: true,
|
||||
verbosity: 1,
|
||||
};
|
||||
let customWrapper;
|
||||
await act(async () => {
|
||||
customWrapper = mountWithContexts(
|
||||
<Formik initialValues={customInitialValues}>
|
||||
<SCMSubForm />
|
||||
</Formik>
|
||||
);
|
||||
});
|
||||
expect(
|
||||
customWrapper
|
||||
.find('Checkbox[aria-label="Update on launch"]')
|
||||
.prop('isDisabled')
|
||||
).toBe(true);
|
||||
});
|
||||
});
|
||||
|
||||
@@ -17,7 +17,6 @@ const initialValues = {
|
||||
source_vars: '---\n',
|
||||
update_cache_timeout: 0,
|
||||
update_on_launch: true,
|
||||
update_on_project_update: false,
|
||||
verbosity: 1,
|
||||
};
|
||||
|
||||
|
||||
@@ -53,11 +53,10 @@ export const VerbosityField = () => {
|
||||
);
|
||||
};
|
||||
|
||||
export const OptionsField = ({ showProjectUpdate = false }) => {
|
||||
export const OptionsField = () => {
|
||||
const [updateOnLaunchField] = useField('update_on_launch');
|
||||
const [, , updateCacheTimeoutHelper] = useField('update_cache_timeout');
|
||||
const [projectField] = useField('source_project');
|
||||
const [updatedOnProjectUpdateField] = useField('update_on_project_update');
|
||||
|
||||
useEffect(() => {
|
||||
if (!updateOnLaunchField.value) {
|
||||
@@ -83,23 +82,11 @@ export const OptionsField = ({ showProjectUpdate = false }) => {
|
||||
tooltip={helpText.subFormOptions.overwriteVariables}
|
||||
/>
|
||||
<CheckboxField
|
||||
isDisabled={updatedOnProjectUpdateField.value}
|
||||
id="update_on_launch"
|
||||
name="update_on_launch"
|
||||
label={t`Update on launch`}
|
||||
tooltip={helpText.subFormOptions.updateOnLaunch(projectField)}
|
||||
/>
|
||||
{showProjectUpdate && (
|
||||
<CheckboxField
|
||||
isDisabled={updateOnLaunchField.value}
|
||||
id="update_on_project_update"
|
||||
name="update_on_project_update"
|
||||
label={t`Update on project update`}
|
||||
tooltip={helpText.subFormOptions.updateOnProjectUpdate(
|
||||
projectField
|
||||
)}
|
||||
/>
|
||||
)}
|
||||
</FormCheckboxLayout>
|
||||
</FormGroup>
|
||||
</FormFullWidthLayout>
|
||||
|
||||
@@ -17,7 +17,6 @@ const initialValues = {
|
||||
source_vars: '---\n',
|
||||
update_cache_timeout: 0,
|
||||
update_on_launch: true,
|
||||
update_on_project_update: false,
|
||||
verbosity: 1,
|
||||
};
|
||||
|
||||
|
||||
@@ -17,7 +17,6 @@ const initialValues = {
|
||||
source_vars: '---\n',
|
||||
update_cache_timeout: 0,
|
||||
update_on_launch: true,
|
||||
update_on_project_update: false,
|
||||
verbosity: 1,
|
||||
};
|
||||
|
||||
|
||||
@@ -115,7 +115,6 @@
|
||||
"update_on_launch":true,
|
||||
"update_cache_timeout":2,
|
||||
"source_project":8,
|
||||
"update_on_project_update":true,
|
||||
"last_update_failed": true,
|
||||
"last_updated":null,
|
||||
"execution_environment": 1
|
||||
|
||||
@@ -268,15 +268,14 @@ function JobDetail({ job, inventorySourceLabels }) {
|
||||
</Link>
|
||||
}
|
||||
/>
|
||||
{inventorySourceLabels.length > 0 && (
|
||||
<Detail
|
||||
dataCy="job-inventory-source-type"
|
||||
label={t`Source`}
|
||||
value={inventorySourceLabels.map(([string, label]) =>
|
||||
string === job.source ? label : null
|
||||
)}
|
||||
/>
|
||||
)}
|
||||
<Detail
|
||||
dataCy="job-inventory-source-type"
|
||||
label={t`Source`}
|
||||
value={inventorySourceLabels.map(([string, label]) =>
|
||||
string === job.source ? label : null
|
||||
)}
|
||||
isEmpty={inventorySourceLabels.length === 0}
|
||||
/>
|
||||
</>
|
||||
)}
|
||||
{inventory_source && inventory_source.source === 'scm' && (
|
||||
@@ -406,7 +405,7 @@ function JobDetail({ job, inventorySourceLabels }) {
|
||||
}
|
||||
/>
|
||||
)}
|
||||
{credentials && credentials.length > 0 && (
|
||||
{credentials && (
|
||||
<Detail
|
||||
dataCy="job-credentials"
|
||||
fullWidth
|
||||
@@ -428,6 +427,7 @@ function JobDetail({ job, inventorySourceLabels }) {
|
||||
))}
|
||||
</ChipGroup>
|
||||
}
|
||||
isEmpty={credentials.length === 0}
|
||||
/>
|
||||
)}
|
||||
{labels && labels.count > 0 && (
|
||||
@@ -451,7 +451,7 @@ function JobDetail({ job, inventorySourceLabels }) {
|
||||
}
|
||||
/>
|
||||
)}
|
||||
{job.job_tags && job.job_tags.length > 0 && (
|
||||
{job.job_tags && (
|
||||
<Detail
|
||||
dataCy="job-tags"
|
||||
fullWidth
|
||||
@@ -474,9 +474,10 @@ function JobDetail({ job, inventorySourceLabels }) {
|
||||
))}
|
||||
</ChipGroup>
|
||||
}
|
||||
isEmpty={job.job_tags.length === 0}
|
||||
/>
|
||||
)}
|
||||
{job.skip_tags && job.skip_tags.length > 0 && (
|
||||
{job.skip_tags && (
|
||||
<Detail
|
||||
dataCy="job-skip-tags"
|
||||
fullWidth
|
||||
@@ -499,6 +500,7 @@ function JobDetail({ job, inventorySourceLabels }) {
|
||||
))}
|
||||
</ChipGroup>
|
||||
}
|
||||
isEmpty={job.skip_tags.length === 0}
|
||||
/>
|
||||
)}
|
||||
<Detail
|
||||
|
||||
@@ -548,4 +548,64 @@ describe('<JobDetail />', () => {
|
||||
assertDetail('Inventory', 'Demo Inventory');
|
||||
assertDetail('Job Slice Parent', 'True');
|
||||
});
|
||||
|
||||
test('should not load Source', () => {
|
||||
wrapper = mountWithContexts(
|
||||
<JobDetail
|
||||
job={{
|
||||
...mockJobData,
|
||||
summary_fields: {
|
||||
inventory_source: {},
|
||||
user_capabilities: {},
|
||||
inventory: { id: 1 },
|
||||
},
|
||||
}}
|
||||
inventorySourceLabels={[]}
|
||||
/>
|
||||
);
|
||||
const source_detail = wrapper.find(`Detail[label="Source"]`).at(0);
|
||||
expect(source_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
|
||||
test('should not load Credentials', () => {
|
||||
wrapper = mountWithContexts(
|
||||
<JobDetail
|
||||
job={{
|
||||
...mockJobData,
|
||||
summary_fields: {
|
||||
user_capabilities: {},
|
||||
credentials: [],
|
||||
},
|
||||
}}
|
||||
/>
|
||||
);
|
||||
const credentials_detail = wrapper
|
||||
.find(`Detail[label="Credentials"]`)
|
||||
.at(0);
|
||||
expect(credentials_detail.prop('isEmpty')).toEqual(true);
|
||||
});
|
||||
|
||||
test('should not load Job Tags', () => {
|
||||
wrapper = mountWithContexts(
|
||||
<JobDetail
|
||||
job={{
|
||||
...mockJobData,
|
||||
job_tags: '',
|
||||
}}
|
||||
/>
|
||||
);
|
||||
expect(wrapper.find('Detail[label="Job Tags"]').length).toBe(0);
|
||||
});
|
||||
|
||||
test('should not load Skip Tags', () => {
|
||||
wrapper = mountWithContexts(
|
||||
<JobDetail
|
||||
job={{
|
||||
...mockJobData,
|
||||
skip_tags: '',
|
||||
}}
|
||||
/>
|
||||
);
|
||||
expect(wrapper.find('Detail[label="Skip Tags"]').length).toBe(0);
|
||||
});
|
||||
});
|
||||
|
||||
@@ -163,6 +163,11 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
|
||||
}
|
||||
addEvents(onReadyEvents);
|
||||
setOnReadyEvents([]);
|
||||
if (isFollowModeEnabled) {
|
||||
setTimeout(() => {
|
||||
scrollToEnd();
|
||||
}, 0);
|
||||
}
|
||||
}, [isTreeReady, onReadyEvents]); // eslint-disable-line react-hooks/exhaustive-deps
|
||||
|
||||
const totalNonCollapsedRows = Math.max(
|
||||
@@ -188,9 +193,6 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
|
||||
}, [location.search]); // eslint-disable-line react-hooks/exhaustive-deps
|
||||
|
||||
useEffect(() => {
|
||||
if (!isJobRunning(jobStatus)) {
|
||||
setIsFollowModeEnabled(false);
|
||||
}
|
||||
rebuildEventsTree();
|
||||
}, [isFlatMode]); // eslint-disable-line react-hooks/exhaustive-deps
|
||||
|
||||
@@ -242,7 +244,7 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
|
||||
if (data.group_name === `${job.type}_events`) {
|
||||
batchedEvents.push(data);
|
||||
clearTimeout(batchTimeout);
|
||||
if (batchedEvents.length >= 25) {
|
||||
if (batchedEvents.length >= 10) {
|
||||
addBatchedEvents();
|
||||
} else {
|
||||
batchTimeout = setTimeout(addBatchedEvents, 500);
|
||||
@@ -268,6 +270,12 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
|
||||
};
|
||||
}, [isJobRunning(jobStatus)]); // eslint-disable-line react-hooks/exhaustive-deps
|
||||
|
||||
useEffect(() => {
|
||||
if (isFollowModeEnabled) {
|
||||
scrollToEnd();
|
||||
}
|
||||
}, [wsEvents.length, isFollowModeEnabled]); // eslint-disable-line react-hooks/exhaustive-deps
|
||||
|
||||
useEffect(() => {
|
||||
if (listRef.current?.recomputeRowHeights) {
|
||||
listRef.current.recomputeRowHeights();
|
||||
@@ -419,9 +427,6 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
|
||||
};
|
||||
|
||||
const rowRenderer = ({ index, parent, key, style }) => {
|
||||
if (listRef.current && isFollowModeEnabled) {
|
||||
setTimeout(() => scrollToRow(remoteRowCount - 1), 0);
|
||||
}
|
||||
let event;
|
||||
let node;
|
||||
try {
|
||||
@@ -568,6 +573,9 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
|
||||
loadRange.forEach((n) => {
|
||||
cache.clear(n);
|
||||
});
|
||||
if (isFollowModeEnabled) {
|
||||
scrollToEnd();
|
||||
}
|
||||
};
|
||||
|
||||
const scrollToRow = (rowIndex) => {
|
||||
@@ -580,14 +588,16 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
|
||||
const handleScrollPrevious = () => {
|
||||
const startIndex = listRef.current.Grid._renderedRowStartIndex;
|
||||
const stopIndex = listRef.current.Grid._renderedRowStopIndex;
|
||||
const scrollRange = stopIndex - startIndex + 1;
|
||||
const scrollRange = stopIndex - startIndex;
|
||||
scrollToRow(Math.max(0, startIndex - scrollRange));
|
||||
setIsFollowModeEnabled(false);
|
||||
};
|
||||
|
||||
const handleScrollNext = () => {
|
||||
const startIndex = listRef.current.Grid._renderedRowStartIndex;
|
||||
const stopIndex = listRef.current.Grid._renderedRowStopIndex;
|
||||
scrollToRow(stopIndex - 1);
|
||||
const scrollRange = stopIndex - startIndex;
|
||||
scrollToRow(stopIndex + scrollRange);
|
||||
};
|
||||
|
||||
const handleScrollFirst = () => {
|
||||
@@ -595,8 +605,14 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
|
||||
setIsFollowModeEnabled(false);
|
||||
};
|
||||
|
||||
const scrollToEnd = () => {
|
||||
scrollToRow(-1);
|
||||
setTimeout(() => scrollToRow(-1), 100);
|
||||
};
|
||||
|
||||
const handleScrollLast = () => {
|
||||
scrollToRow(totalNonCollapsedRows + wsEvents.length);
|
||||
scrollToEnd();
|
||||
setIsFollowModeEnabled(true);
|
||||
};
|
||||
|
||||
const handleResize = ({ width }) => {
|
||||
@@ -619,6 +635,9 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
|
||||
}
|
||||
scrollTop.current = e.scrollTop;
|
||||
scrollHeight.current = e.scrollHeight;
|
||||
if (e.scrollTop + e.clientHeight >= e.scrollHeight) {
|
||||
setIsFollowModeEnabled(true);
|
||||
}
|
||||
};
|
||||
|
||||
const handleExpandCollapseAll = () => {
|
||||
@@ -658,8 +677,7 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
|
||||
job={job}
|
||||
eventRelatedSearchableKeys={eventRelatedSearchableKeys}
|
||||
eventSearchableKeys={eventSearchableKeys}
|
||||
remoteRowCount={remoteRowCount}
|
||||
scrollToRow={scrollToRow}
|
||||
scrollToEnd={scrollToEnd}
|
||||
isFollowModeEnabled={isFollowModeEnabled}
|
||||
setIsFollowModeEnabled={setIsFollowModeEnabled}
|
||||
/>
|
||||
@@ -718,7 +736,6 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
|
||||
rowCount={totalNonCollapsedRows + wsEvents.length}
|
||||
rowHeight={cache.rowHeight}
|
||||
rowRenderer={rowRenderer}
|
||||
scrollToAlignment="start"
|
||||
width={width || 1}
|
||||
overscanRowCount={20}
|
||||
onScroll={handleScroll}
|
||||
|
||||
@@ -30,8 +30,7 @@ function JobOutputSearch({
|
||||
job,
|
||||
eventRelatedSearchableKeys,
|
||||
eventSearchableKeys,
|
||||
remoteRowCount,
|
||||
scrollToRow,
|
||||
scrollToEnd,
|
||||
isFollowModeEnabled,
|
||||
setIsFollowModeEnabled,
|
||||
}) {
|
||||
@@ -83,7 +82,7 @@ function JobOutputSearch({
|
||||
setIsFollowModeEnabled(false);
|
||||
} else {
|
||||
setIsFollowModeEnabled(true);
|
||||
scrollToRow(remoteRowCount - 1);
|
||||
scrollToEnd();
|
||||
}
|
||||
};
|
||||
|
||||
|
||||
@@ -30,7 +30,7 @@ function OrganizationDetail({ organization }) {
|
||||
created,
|
||||
modified,
|
||||
summary_fields,
|
||||
galaxy_credentials,
|
||||
galaxy_credentials = [],
|
||||
} = organization;
|
||||
const [contentError, setContentError] = useState(null);
|
||||
const [hasContentLoading, setHasContentLoading] = useState(true);
|
||||
@@ -121,7 +121,7 @@ function OrganizationDetail({ organization }) {
|
||||
date={modified}
|
||||
user={summary_fields.modified_by}
|
||||
/>
|
||||
{instanceGroups && instanceGroups.length > 0 && (
|
||||
{instanceGroups && (
|
||||
<Detail
|
||||
fullWidth
|
||||
label={t`Instance Groups`}
|
||||
@@ -145,35 +145,35 @@ function OrganizationDetail({ organization }) {
|
||||
))}
|
||||
</ChipGroup>
|
||||
}
|
||||
isEmpty={instanceGroups.length === 0}
|
||||
/>
|
||||
)}
|
||||
{galaxy_credentials && galaxy_credentials.length > 0 && (
|
||||
<Detail
|
||||
fullWidth
|
||||
label={t`Galaxy Credentials`}
|
||||
value={
|
||||
<ChipGroup
|
||||
numChips={5}
|
||||
totalChips={galaxy_credentials.length}
|
||||
ouiaId="galaxy-credential-chips"
|
||||
>
|
||||
{galaxy_credentials.map((credential) => (
|
||||
<Link
|
||||
<Detail
|
||||
fullWidth
|
||||
label={t`Galaxy Credentials`}
|
||||
value={
|
||||
<ChipGroup
|
||||
numChips={5}
|
||||
totalChips={galaxy_credentials?.length}
|
||||
ouiaId="galaxy-credential-chips"
|
||||
>
|
||||
{galaxy_credentials?.map((credential) => (
|
||||
<Link
|
||||
key={credential.id}
|
||||
to={`/credentials/${credential.id}/details`}
|
||||
>
|
||||
<CredentialChip
|
||||
credential={credential}
|
||||
key={credential.id}
|
||||
to={`/credentials/${credential.id}/details`}
|
||||
>
|
||||
<CredentialChip
|
||||
credential={credential}
|
||||
key={credential.id}
|
||||
isReadOnly
|
||||
ouiaId={`galaxy-credential-${credential.id}-chip`}
|
||||
/>
|
||||
</Link>
|
||||
))}
|
||||
</ChipGroup>
|
||||
}
|
||||
/>
|
||||
)}
|
||||
isReadOnly
|
||||
ouiaId={`galaxy-credential-${credential.id}-chip`}
|
||||
/>
|
||||
</Link>
|
||||
))}
|
||||
</ChipGroup>
|
||||
}
|
||||
isEmpty={galaxy_credentials?.length === 0}
|
||||
/>
|
||||
</DetailList>
|
||||
<CardActionsRow>
|
||||
{summary_fields.user_capabilities.edit && (
|
||||
|
||||
@@ -216,4 +216,44 @@ describe('<OrganizationDetail />', () => {
(el) => el.length === 0
);
});

test('should not load instance groups', async () => {
OrganizationsAPI.readInstanceGroups.mockResolvedValue({
data: {
results: [],
},
});

let wrapper;
await act(async () => {
wrapper = mountWithContexts(
<OrganizationDetail organization={mockOrganization} />
);
});
wrapper.update();
const instance_groups_detail = wrapper
.find(`Detail[label="Instance Groups"]`)
.at(0);
expect(instance_groups_detail.prop('isEmpty')).toEqual(true);
});

test('should not load galaxy credentials', async () => {
OrganizationsAPI.readInstanceGroups.mockResolvedValue({ data: {} });
let wrapper;
await act(async () => {
wrapper = mountWithContexts(
<OrganizationDetail
organization={{
...mockOrganization,
credential: [],
}}
/>
);
});
wrapper.update();
const galaxy_credentials_detail = wrapper
.find(`Detail[label="Galaxy Credentials"]`)
.at(0);
expect(galaxy_credentials_detail.prop('isEmpty')).toEqual(true);
});
});

@@ -35,6 +35,13 @@ function SubscriptionDetail() {
},
];

const { automated_instances: automatedInstancesCount, automated_since } =
license_info;

const automatedInstancesSinceDateTime = automated_since
? formatDateString(new Date(automated_since * 1000).toISOString())
: null;

return (
<>
<RoutedTabs tabsArray={tabsArray} />

@@ -127,19 +134,23 @@ function SubscriptionDetail() {
label={t`Hosts imported`}
value={license_info.current_instances}
/>
<Detail
dataCy="subscription-hosts-automated"
label={t`Hosts automated`}
value={
<>
{license_info.automated_instances} <Trans>since</Trans>{' '}
{license_info.automated_since &&
formatDateString(
new Date(license_info.automated_since * 1000).toISOString()
)}
</>
}
/>
{typeof automatedInstancesCount !== 'undefined' &&
automatedInstancesCount !== null && (
<Detail
dataCy="subscription-hosts-automated"
label={t`Hosts automated`}
value={
automated_since ? (
<Trans>
{automatedInstancesCount} since{' '}
{automatedInstancesSinceDateTime}
</Trans>
) : (
automatedInstancesCount
)
}
/>
)}
<Detail
dataCy="subscription-hosts-remaining"
label={t`Hosts remaining`}

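One detail in the SubscriptionDetail refactor worth calling out: license_info.automated_since is an epoch timestamp in seconds, so it is multiplied by 1000 before being handed to Date. A standalone illustration (the example value is invented; in the component the resulting ISO string is passed on to awx's formatDateString helper, exactly as the hunk above does):

// automated_since arrives as Unix seconds; JavaScript Dates expect milliseconds.
const automatedSince = 1640995200; // example: 2022-01-01T00:00:00Z as epoch seconds
const iso = new Date(automatedSince * 1000).toISOString();
console.log(iso); // "2022-01-01T00:00:00.000Z"
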
@@ -82,4 +82,17 @@ describe('<SubscriptionDetail />', () => {

expect(wrapper.find('Button[aria-label="edit"]').length).toBe(1);
});

test('should not render Hosts Automated Detail if license_info.automated_instances is undefined', () => {
wrapper = mountWithContexts(<SubscriptionDetail />, {
context: {
config: {
...config,
license_info: { ...config.license_info, automated_instances: null },
},
},
});

expect(wrapper.find(`Detail[label="Hosts automated"]`).length).toBe(0);
});
});

@@ -354,7 +354,7 @@ function JobTemplateDetail({ template }) {
helpText={helpText.enabledOptions}
/>
)}
{summary_fields.credentials && summary_fields.credentials.length > 0 && (
{summary_fields.credentials && (
<Detail
fullWidth
label={t`Credentials`}

@@ -378,9 +378,10 @@ function JobTemplateDetail({ template }) {
))}
</ChipGroup>
}
isEmpty={summary_fields.credentials.length === 0}
/>
)}
{summary_fields.labels && summary_fields.labels.results.length > 0 && (
{summary_fields.labels && (
<Detail
fullWidth
label={t`Labels`}

@@ -399,36 +400,36 @@ function JobTemplateDetail({ template }) {
))}
</ChipGroup>
}
isEmpty={summary_fields.labels.results.length === 0}
/>
)}
{instanceGroups.length > 0 && (
<Detail
fullWidth
label={t`Instance Groups`}
dataCy="jt-detail-instance-groups"
helpText={helpText.instanceGroups}
value={
<ChipGroup
numChips={5}
totalChips={instanceGroups.length}
ouiaId="instance-group-chips"
>
{instanceGroups.map((ig) => (
<Link to={`${buildLinkURL(ig)}${ig.id}/details`} key={ig.id}>
<Chip
key={ig.id}
ouiaId={`instance-group-${ig.id}-chip`}
isReadOnly
>
{ig.name}
</Chip>
</Link>
))}
</ChipGroup>
}
/>
)}
{job_tags && job_tags.length > 0 && (
<Detail
fullWidth
label={t`Instance Groups`}
dataCy="jt-detail-instance-groups"
helpText={helpText.instanceGroups}
value={
<ChipGroup
numChips={5}
totalChips={instanceGroups.length}
ouiaId="instance-group-chips"
>
{instanceGroups.map((ig) => (
<Link to={`${buildLinkURL(ig)}${ig.id}/details`} key={ig.id}>
<Chip
key={ig.id}
ouiaId={`instance-group-${ig.id}-chip`}
isReadOnly
>
{ig.name}
</Chip>
</Link>
))}
</ChipGroup>
}
isEmpty={instanceGroups.length === 0}
/>
{job_tags && (
<Detail
fullWidth
label={t`Job Tags`}

@@ -451,9 +452,10 @@ function JobTemplateDetail({ template }) {
))}
</ChipGroup>
}
isEmpty={job_tags.length === 0}
/>
)}
{skip_tags && skip_tags.length > 0 && (
{skip_tags && (
<Detail
fullWidth
label={t`Skip Tags`}

@@ -476,6 +478,7 @@ function JobTemplateDetail({ template }) {
))}
</ChipGroup>
}
isEmpty={skip_tags.length === 0}
/>
)}
<VariablesDetail

@@ -195,4 +195,94 @@ describe('<JobTemplateDetail />', () => {
wrapper.find(`Detail[label="Execution Environment"] dd`).text()
).toBe('Default EE');
});

test('should not load credentials', async () => {
await act(async () => {
wrapper = mountWithContexts(
<JobTemplateDetail
template={{
...mockTemplate,
allow_simultaneous: true,
ask_inventory_on_launch: true,
summary_fields: {
credentials: [],
},
}}
/>
);
});
const credentials_detail = wrapper
.find(`Detail[label="Credentials"]`)
.at(0);
expect(credentials_detail.prop('isEmpty')).toEqual(true);
});

test('should not load labels', async () => {
await act(async () => {
wrapper = mountWithContexts(
<JobTemplateDetail
template={{
...mockTemplate,
allow_simultaneous: true,
ask_inventory_on_launch: true,
summary_fields: {
labels: {
results: [],
},
},
}}
/>
);
});
const labels_detail = wrapper.find(`Detail[label="Labels"]`).at(0);
expect(labels_detail.prop('isEmpty')).toEqual(true);
});

test('should not load instance groups', async () => {
JobTemplatesAPI.readInstanceGroups.mockResolvedValue({
data: {
results: [],
},
});

let wrapper;
await act(async () => {
wrapper = mountWithContexts(
<JobTemplateDetail template={mockTemplate} />
);
});
wrapper.update();
const instance_groups_detail = wrapper
.find(`Detail[label="Instance Groups"]`)
.at(0);
expect(instance_groups_detail.prop('isEmpty')).toEqual(true);
});

test('should not load job tags', async () => {
await act(async () => {
wrapper = mountWithContexts(
<JobTemplateDetail
template={{
...mockTemplate,
job_tags: '',
}}
/>
);
});
expect(wrapper.find('Detail[label="Job Tags"]').length).toBe(0);
});

test('should not load skip tags', async () => {
await act(async () => {
wrapper = mountWithContexts(
<JobTemplateDetail
template={{
...mockTemplate,
skip_tags: '',
}}
/>
);
});
expect(wrapper.find('Detail[label="Skip Tags"]').length).toBe(0);
});
});

@@ -110,12 +110,11 @@ function WorkflowJobTemplateDetail({ template }) {
<DetailList gutter="sm">
<Detail label={t`Name`} value={name} dataCy="jt-detail-name" />
<Detail label={t`Description`} value={description} />
{summary_fields.recent_jobs?.length > 0 && (
<Detail
value={<Sparkline jobs={recentPlaybookJobs} />}
label={t`Activity`}
/>
)}
<Detail
value={<Sparkline jobs={recentPlaybookJobs} />}
label={t`Activity`}
isEmpty={summary_fields.recent_jobs?.length === 0}
/>
{summary_fields.organization && (
<Detail
label={t`Organization`}

@@ -202,26 +201,25 @@ function WorkflowJobTemplateDetail({ template }) {
helpText={helpText.enabledOptions}
/>
)}
{summary_fields.labels?.results?.length > 0 && (
<Detail
fullWidth
label={t`Labels`}
helpText={helpText.labels}
value={
<ChipGroup
numChips={3}
totalChips={summary_fields.labels.results.length}
ouiaId="workflow-job-template-detail-label-chips"
>
{summary_fields.labels.results.map((l) => (
<Chip key={l.id} ouiaId={`${l.name}-label-chip`} isReadOnly>
{l.name}
</Chip>
))}
</ChipGroup>
}
/>
)}
<Detail
fullWidth
label={t`Labels`}
helpText={helpText.labels}
value={
<ChipGroup
numChips={3}
totalChips={summary_fields.labels.results.length}
ouiaId="workflow-job-template-detail-label-chips"
>
{summary_fields.labels.results.map((l) => (
<Chip key={l.id} ouiaId={`${l.name}-label-chip`} isReadOnly>
{l.name}
</Chip>
))}
</ChipGroup>
}
isEmpty={!summary_fields.labels?.results?.length}
/>
<VariablesDetail
dataCy="workflow-job-template-detail-extra-vars"
helpText={helpText.variables}

@@ -178,4 +178,46 @@ describe('<WorkflowJobTemplateDetail/>', () => {
expect(inventory.prop('to')).toEqual('/inventories/inventory/1/details');
expect(organization.prop('to')).toEqual('/organizations/1/details');
});

test('should not load Activity', async () => {
await act(async () => {
wrapper = mountWithContexts(
<WorkflowJobTemplateDetail
template={{
...template,
summary_fields: {
...template.summary_fields,
recent_jobs: [],
},
}}
hasContentLoading={false}
onSetContentLoading={() => {}}
/>
);
});
const activity_detail = wrapper.find(`Detail[label="Activity"]`).at(0);
expect(activity_detail.prop('isEmpty')).toEqual(true);
});

test('should not load Labels', async () => {
await act(async () => {
wrapper = mountWithContexts(
<WorkflowJobTemplateDetail
template={{
...template,
summary_fields: {
...template.summary_fields,
labels: {
results: [],
},
},
}}
hasContentLoading={false}
onSetContentLoading={() => {}}
/>
);
});
const labels_detail = wrapper.find(`Detail[label="Labels"]`).at(0);
expect(labels_detail.prop('isEmpty')).toEqual(true);
});
});

@@ -44,30 +44,28 @@ function TemplatePopoverContent({ template }) {
value={template?.playbook}
dataCy={`template-${template.id}-playbook`}
/>
{template.summary_fields?.credentials &&
template.summary_fields.credentials.length ? (
<Detail
fullWidth
label={t`Credentials`}
dataCy={`template-${template.id}-credentials`}
value={
<ChipGroup
numChips={5}
totalChips={template.summary_fields.credentials.length}
ouiaId={`template-${template.id}-credential-chips`}
>
{template.summary_fields.credentials.map((c) => (
<CredentialChip
key={c.id}
credential={c}
isReadOnly
ouiaId={`credential-${c.id}-chip`}
/>
))}
</ChipGroup>
}
/>
) : null}
<Detail
fullWidth
label={t`Credentials`}
dataCy={`template-${template.id}-credentials`}
value={
<ChipGroup
numChips={5}
totalChips={template.summary_fields?.credentials?.length}
ouiaId={`template-${template.id}-credential-chips`}
>
{template.summary_fields?.credentials?.map((c) => (
<CredentialChip
key={c.id}
credential={c}
isReadOnly
ouiaId={`credential-${c.id}-chip`}
/>
))}
</ChipGroup>
}
isEmpty={template.summary_fields?.credentials?.length === 0}
/>
</DetailList>
);
}

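Note the two emptiness checks used in this set of changes. The expression template.summary_fields?.credentials?.length === 0 used here is true only when the credentials array exists and is empty, because undefined === 0 is false, while the !...?.length form used by WorkflowJobTemplateDetail (and by WorkflowApprovalDetail below) also treats a missing array as empty. A quick illustration of the difference:

const isEmptyStrict = (t) => t.summary_fields?.credentials?.length === 0;
const isEmptyLoose = (t) => !t.summary_fields?.credentials?.length;

console.log(isEmptyStrict({ summary_fields: { credentials: [] } }));          // true
console.log(isEmptyStrict({ summary_fields: {} }));                           // false (undefined === 0)
console.log(isEmptyLoose({ summary_fields: {} }));                            // true  (undefined is falsy)
console.log(isEmptyStrict({ summary_fields: { credentials: [{ id: 1 }] } })); // false
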
@@ -309,25 +309,24 @@ function WorkflowApprovalDetail({ workflowApproval }) {
dataCy="wa-detail-inventory"
/>
) : null}
{workflowJob?.summary_fields?.labels?.results?.length > 0 && (
<Detail
fullWidth
label={t`Labels`}
value={
<ChipGroup
numChips={5}
totalChips={workflowJob.summary_fields.labels.results.length}
ouiaId="wa-detail-label-chips"
>
{workflowJob.summary_fields.labels.results.map((label) => (
<Chip key={label.id} isReadOnly>
{label.name}
</Chip>
))}
</ChipGroup>
}
/>
)}
<Detail
fullWidth
label={t`Labels`}
value={
<ChipGroup
numChips={5}
totalChips={workflowJob.summary_fields.labels.results.length}
ouiaId="wa-detail-label-chips"
>
{workflowJob.summary_fields.labels.results.map((label) => (
<Chip key={label.id} isReadOnly>
{label.name}
</Chip>
))}
</ChipGroup>
}
isEmpty={!workflowJob?.summary_fields?.labels?.results?.length}
/>
{workflowJob?.extra_vars ? (
<VariablesDetail
dataCy="wa-detail-variables"

@@ -482,6 +482,33 @@ describe('<WorkflowApprovalDetail />', () => {
expect(wrapper.find('DeleteButton').length).toBe(1);
});

test('should not load Labels', async () => {
WorkflowJobTemplatesAPI.readDetail.mockResolvedValue({
data: workflowJobTemplate,
});
WorkflowJobsAPI.readDetail.mockResolvedValue({
data: {
...workflowApproval,
summary_fields: {
...workflowApproval.summary_fields,
labels: {
results: [],
},
},
},
});

let wrapper;
await act(async () => {
wrapper = mountWithContexts(
<WorkflowApprovalDetail workflowApproval={workflowApproval} />
);
});
waitForElement(wrapper, 'WorkflowApprovalDetail', (el) => el.length > 0);
const labels_detail = wrapper.find(`Detail[label="Labels"]`).at(0);
expect(labels_detail.prop('isEmpty')).toEqual(true);
});

test('Error dialog shown for failed approval', async () => {
WorkflowApprovalsAPI.approve.mockImplementationOnce(() =>
Promise.reject(new Error())

@@ -1,9 +1,9 @@
/* eslint-disable no-undef */
importScripts('https://d3js.org/d3-collection.v1.min.js');
importScripts('https://d3js.org/d3-dispatch.v1.min.js');
importScripts('https://d3js.org/d3-quadtree.v1.min.js');
importScripts('https://d3js.org/d3-timer.v1.min.js');
importScripts('https://d3js.org/d3-force.v1.min.js');
importScripts('d3-collection.v1.min.js');
importScripts('d3-dispatch.v1.min.js');
importScripts('d3-quadtree.v1.min.js');
importScripts('d3-timer.v1.min.js');
importScripts('d3-force.v1.min.js');

onmessage = function calculateLayout({ data: { nodes, links } }) {
const simulation = d3

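Switching from the d3js.org URLs to bare file names changes where importScripts() looks: relative specifiers are resolved against the worker script's own URL, so the vendored d3-*.v1.min.js files have to be shipped and served next to the worker instead of being fetched from the CDN at run time. A small sketch of that assumption (the paths and file layout here are illustrative, not taken from the awx build):

// Main thread: the worker's URL is the base against which the relative
// importScripts() calls inside it are resolved.
const worker = new Worker('/static/js/d3-layout.worker.js');

// Inside the worker, these two forms are equivalent as long as the vendored
// copy is served from the same directory as the worker script:
//   importScripts('d3-force.v1.min.js');
//   importScripts(new URL('d3-force.v1.min.js', self.location.href).href);
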
@@ -33,6 +33,7 @@ action_groups:
- role
- schedule
- settings
- subscriptions
- team
- token
- user

@@ -23,7 +23,7 @@ options:
manifest:
description:
- file path to a Red Hat subscription manifest (a .zip file)
required: True
required: False
type: str
force:
description:

@@ -31,6 +31,11 @@ options:
unlicensed or trial licensed. When force=true, the license is always applied.
type: bool
default: 'False'
pool_id:
description:
- Red Hat or Red Hat Satellite pool_id to attach to
required: False
type: str
state:
description:
- Desired state of the resource.

@@ -47,6 +52,10 @@ EXAMPLES = '''
license:
manifest: "/tmp/my_manifest.zip"

- name: Attach to a pool
license:
pool_id: 123456

- name: Remove license
license:
state: absent

@@ -61,12 +70,14 @@ def main():
module = ControllerAPIModule(
argument_spec=dict(
manifest=dict(type='str', required=False),
pool_id=dict(type='str', required=False),
force=dict(type='bool', default=False),
state=dict(choices=['present', 'absent'], default='present'),
),
required_if=[
['state', 'present', ['manifest']],
['state', 'present', ['manifest', 'pool_id'], True],
],
mutually_exclusive=[("manifest", "pool_id")],
)

json_output = {'changed': False}

@@ -77,11 +88,12 @@ def main():
module.delete_endpoint('config')
module.exit_json(**json_output)

try:
with open(module.params.get('manifest'), 'rb') as fid:
manifest = base64.b64encode(fid.read())
except OSError as e:
module.fail_json(msg=str(e))
if module.params.get('manifest', None):
try:
with open(module.params.get('manifest'), 'rb') as fid:
manifest = base64.b64encode(fid.read())
except OSError as e:
module.fail_json(msg=str(e))

# Check if Tower is already licensed
config = module.get_endpoint('config')['json']

@@ -104,7 +116,10 @@ def main():
# Do the actual install, if we need to
if perform_install:
json_output['changed'] = True
module.post_endpoint('config', data={'manifest': manifest.decode()})
if module.params.get('manifest', None):
module.post_endpoint('config', data={'manifest': manifest.decode()})
else:
module.post_endpoint('config/attach', data={'pool_id': module.params.get('pool_id')})

module.exit_json(**json_output)

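In the updated argument spec, manifest is no longer required on its own: the entry ['state', 'present', ['manifest', 'pool_id'], True] uses Ansible's four-element required_if form, where the trailing True means "at least one of" rather than "all of", and mutually_exclusive keeps manifest and pool_id from being supplied together. The combined constraint, restated as a standalone predicate (JavaScript is used here purely for illustration, and the message strings are not Ansible's exact wording):

// Mirrors the intent of required_if + mutually_exclusive in the hunk above.
function checkLicenseParams({ state = 'present', manifest, pool_id }) {
  if (manifest && pool_id) {
    return 'manifest and pool_id are mutually exclusive';
  }
  if (state === 'present' && !manifest && !pool_id) {
    return 'state=present requires at least one of: manifest, pool_id';
  }
  return null; // parameters are acceptable
}

console.log(checkLicenseParams({ manifest: '/tmp/my_manifest.zip' })); // null
console.log(checkLicenseParams({ pool_id: '123456' }));                // null
console.log(checkLicenseParams({}));                                   // "state=present requires at least one of: manifest, pool_id"
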
101
awx_collection/plugins/modules/subscriptions.py
Normal file
@@ -0,0 +1,101 @@
#!/usr/bin/python
# coding: utf-8 -*-

# (c) 2019, John Westcott IV <john.westcott.iv@redhat.com>
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)

from __future__ import absolute_import, division, print_function

__metaclass__ = type

ANSIBLE_METADATA = {'metadata_version': '1.1', 'status': ['preview'], 'supported_by': 'community'}


DOCUMENTATION = '''
---
module: subscriptions
author: "John Westcott IV (@john-westcott-iv)"
short_description: Get subscription list
description:
- Get subscriptions available to Automation Platform Controller. See
U(https://www.ansible.com/tower) for an overview.
options:
username:
description:
- Red Hat or Red Hat Satellite username to get available subscriptions.
- The credentials you use will be stored for future use in retrieving renewal or expanded subscriptions
required: True
type: str
password:
description:
- Red Hat or Red Hat Satellite password to get available subscriptions.
- The credentials you use will be stored for future use in retrieving renewal or expanded subscriptions
required: True
type: str
filters:
description:
- Client side filters to apply to the subscriptions.
- For any entries in this dict, if there is a corresponding entry in the subscription it must contain the value from this dict
- Note This is a client side search, not an API side search
required: False
type: dict
extends_documentation_fragment: awx.awx.auth
'''

RETURN = '''
subscriptions:
description: dictionary containing information about the subscriptions
returned: If login succeeded
type: dict
'''

EXAMPLES = '''
- name: Get subscriptions
subscriptions:
username: "my_username"
password: "My Password"

- name: Get subscriptions with a filter
subscriptions:
username: "my_username"
password: "My Password"
filters:
product_name: "Red Hat Ansible Automation Platform"
support_level: "Self-Support"
'''

from ..module_utils.controller_api import ControllerAPIModule


def main():

module = ControllerAPIModule(
argument_spec=dict(
username=dict(type='str', required=True),
password=dict(type='str', no_log=True, required=True),
filters=dict(type='dict', required=False, default={}),
),
)

json_output = {'changed': False}

# Check if Tower is already licensed
post_data = {
'subscriptions_password': module.params.get('password'),
'subscriptions_username': module.params.get('username'),
}
all_subscriptions = module.post_endpoint('config/subscriptions', data=post_data)['json']
json_output['subscriptions'] = []
for subscription in all_subscriptions:
add = True
for key in module.params.get('filters').keys():
if subscription.get(key, None) and module.params.get('filters')[key] not in subscription.get(key):
add = False
if add:
json_output['subscriptions'].append(subscription)

module.exit_json(**json_output)


if __name__ == '__main__':
main()

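The new subscriptions module retrieves everything from a POST to config/subscriptions and then applies filters purely client-side: for each filter key, a subscription is dropped only when it has that field and the field does not contain the filter value, while keys missing from a subscription do not exclude it. The same logic restated outside Ansible, with invented sample data (JavaScript used only for illustration):

const subscriptions = [
  { product_name: 'Red Hat Ansible Automation Platform', support_level: 'Self-Support' },
  { product_name: 'Red Hat Ansible Automation Platform', support_level: 'Premium' },
  { product_name: 'Some Other Product', support_level: 'Self-Support' },
];
const filters = { product_name: 'Ansible Automation', support_level: 'Self-Support' };

// Keep a subscription when, for every filter key, the field is absent or
// contains the filter value as a substring (mirroring the module's loop).
const matches = subscriptions.filter((sub) =>
  Object.entries(filters).every(([key, value]) => !sub[key] || sub[key].includes(value))
);

console.log(matches.length); // 1: only the first entry satisfies both filters
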
@@ -41,6 +41,7 @@ no_endpoint_for_module = [
'workflow_template',
'ad_hoc_command_wait',
'ad_hoc_command_cancel',
'subscriptions', # Subscription deals with config/subscriptions
]

# Global module parameters we can ignore
Some files were not shown because too many files have changed in this diff.