Mirror of https://github.com/ansible/awx.git, synced 2026-03-30 23:35:05 -02:30

Comparing 21.4.0...revert-124 (1 commit)
Commit 4ba8dd4d98

.github/ISSUE_TEMPLATE.md (vendored, 2 changed lines)
@@ -25,7 +25,7 @@ Instead use the bug or feature request.

<!--- Pick one below and delete the rest: -->
- Breaking Change
- New or Enhanced Feature
- Bug, Docs Fix or other nominal change
- Bug or Docs Fix


##### COMPONENT NAME
.github/PULL_REQUEST_TEMPLATE.md (vendored, 2 changed lines)
@@ -11,7 +11,7 @@ the change does.

<!--- Pick one below and delete the rest: -->
- Breaking Change
- New or Enhanced Feature
- Bug, Docs Fix or other nominal change
- Bug or Docs Fix

##### COMPONENT NAME

<!--- Name of the module/plugin/module/task -->
.github/triage_replies.md (vendored, 3 changed lines)
@@ -93,9 +93,6 @@ The Ansible Community is looking at building an EE that corresponds to all of th

- AWX: https://github.com/ansible/awx/blob/devel/CONTRIBUTING.md
- AWX-Operator: https://github.com/ansible/awx-operator/blob/devel/CONTRIBUTING.md

### Oracle AWX

We'd be happy to help if you can reproduce this with AWX, since we do not have Oracle's Linux Automation Manager. If you need help with this specific version of Oracle's Linux Automation Manager, you will need to contact Oracle for support.

### AWX Release

Subject: Announcing AWX Xa.Ya.za and AWX-Operator Xb.Yb.zb
.github/workflows/ci.yml (vendored, 11 changed lines)
@@ -111,18 +111,9 @@ jobs:
          repository: ansible/awx-operator
          path: awx-operator

      - name: Get python version from Makefile
        working-directory: awx
        run: echo py_version=`make PYTHON_VERSION` >> $GITHUB_ENV

      - name: Install python ${{ env.py_version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ env.py_version }}

      - name: Install playbook dependencies
        run: |
          python3 -m pip install docker
          python3 -m pip install docker setuptools_scm

      - name: Build AWX image
        working-directory: awx
.github/workflows/pr_body_check.yml (vendored, 45 changed lines)
@@ -1,45 +0,0 @@
---
name: PR Check
env:
  BRANCH: ${{ github.base_ref || 'devel' }}
on:
  pull_request:
    types: [opened, edited, reopened, synchronize]
jobs:
  pr-check:
    name: Scan PR description for semantic versioning keywords
    runs-on: ubuntu-latest
    permissions:
      packages: write
      contents: read
    steps:
      - name: Write PR body to a file
        run: |
          cat >> pr.body << __SOME_RANDOM_PR_EOF__
          ${{ github.event.pull_request.body }}
          __SOME_RANDOM_PR_EOF__

      - name: Display the received body for troubleshooting
        run: cat pr.body

      # We want to write these out individually just in case the options were joined on a single line
      - name: Check for each of the lines
        run: |
          grep "Bug, Docs Fix or other nominal change" pr.body > Z
          grep "New or Enhanced Feature" pr.body > Y
          grep "Breaking Change" pr.body > X
          exit 0
        # We exit 0 and set the shell to prevent the returns from the greps from failing this step
        # See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#exit-codes-and-error-action-preference
        shell: bash {0}

      - name: Check for exactly one item
        run: |
          if [ $(cat X Y Z | wc -l) != 1 ] ; then
            echo "The PR body must contain exactly one of [ 'Bug, Docs Fix or other nominal change', 'New or Enhanced Feature', 'Breaking Change' ]"
            echo "We counted $(cat X Y Z | wc -l)"
            echo "See the default PR body for examples"
            exit 255;
          else
            exit 0;
          fi
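The deleted workflow's core check — the PR body must contain exactly one of the three change-type phrases — can be reproduced locally with a small script. This is a sketch: the sample `pr.body` content is illustrative, and it counts matching phrases rather than matched lines like the workflow's `wc -l` approach.

```shell
# Example PR description (illustrative content).
cat > pr.body << 'EOF'
##### SUMMARY
Fix a typo in the docs.

- New or Enhanced Feature
EOF

# Count how many of the three change-type phrases appear in the PR body.
count=0
for phrase in "Bug, Docs Fix or other nominal change" \
              "New or Enhanced Feature" \
              "Breaking Change"; do
  grep -q "$phrase" pr.body && count=$((count + 1))
done

# Exactly one phrase must be present for the check to pass.
if [ "$count" -eq 1 ]; then
  echo "PR body OK"
else
  echo "The PR body must contain exactly one change-type line (found $count)"
fi
```

Note that "Bug, Docs Fix or other nominal change" does not contain either of the other phrases, so a well-formed body matches exactly once.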
.github/workflows/stage.yml (vendored, 2 changed lines)
@@ -65,7 +65,7 @@ jobs:

      - name: Install playbook dependencies
        run: |
          python3 -m pip install docker
          python3 -m pip install docker setuptools_scm

      - name: Build and stage AWX
        working-directory: awx
@@ -19,17 +19,16 @@ Have questions about this document or anything not covered here? Come chat with

- [Purging containers and images](#purging-containers-and-images)
- [Pre commit hooks](#pre-commit-hooks)
- [What should I work on?](#what-should-i-work-on)
- [Translations](#translations)
- [Submitting Pull Requests](#submitting-pull-requests)
- [PR Checks run by Zuul](#pr-checks-run-by-zuul)
- [Reporting Issues](#reporting-issues)
- [Getting Help](#getting-help)

## Things to know prior to submitting code

- All code submissions are done through pull requests against the `devel` branch.
- You must use `git commit --signoff` for any commit to be merged, and agree that usage of `--signoff` constitutes agreement with the terms of [DCO 1.1](./DCO_1_1.md).
- Take care to make sure no merge commits are in the submission, and use `git rebase` instead of `git merge` for this reason.
- If collaborating with someone else on the same branch, consider using `--force-with-lease` instead of `--force`. This will prevent you from accidentally overwriting commits pushed by someone else. For more information, see the [git push docs](https://git-scm.com/docs/git-push#git-push---force-with-leaseltrefnamegt).
- If collaborating with someone else on the same branch, consider using `--force-with-lease` instead of `--force`. This will prevent you from accidentally overwriting commits pushed by someone else. For more information, see https://git-scm.com/docs/git-push#git-push---force-with-leaseltrefnamegt
- If submitting a large code change, it's a good idea to join the `#ansible-awx` channel on irc.libera.chat and talk about what you would like to do or add first. This not only helps everyone know what's going on, it also helps save time and effort if the community decides some changes are needed.
- We ask all of our community members and contributors to adhere to the [Ansible code of conduct](http://docs.ansible.com/ansible/latest/community/code_of_conduct.html). If you have questions or need assistance, please reach out to our community team at [codeofconduct@ansible.com](mailto:codeofconduct@ansible.com).
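The `--signoff` requirement above can be demonstrated in a throwaway repository; this is a sketch, with the user name, email, and the commented push command all illustrative.

```shell
# Create a disposable repository so nothing real is touched.
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.name "Example Dev"
git config user.email "dev@example.com"

echo "change" > file.txt
git add file.txt
# --signoff appends the DCO "Signed-off-by" trailer to the commit message.
git commit -q --signoff -m "Fix the thing"

# Show the full commit message, including the trailer.
git log -1 --format=%B

# After rebasing a shared branch onto devel, push with:
#   git push --force-with-lease origin my-branch
# which refuses to overwrite commits someone else pushed in the meantime.
```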
@@ -43,7 +42,8 @@ The AWX development environment workflow and toolchain uses Docker and the docke

Prior to starting the development services, you'll need `docker` and `docker-compose`. On Linux, you can generally find these in your distro's packaging, but you may find that Docker themselves maintain a separate repo that tracks more closely to the latest releases.

For macOS and Windows, we recommend [Docker for Mac](https://www.docker.com/docker-mac) and [Docker for Windows](https://www.docker.com/docker-windows) respectively.
For macOS and Windows, we recommend [Docker for Mac](https://www.docker.com/docker-mac) and [Docker for Windows](https://www.docker.com/docker-windows)
respectively.

For Linux platforms, refer to the following from Docker:
@@ -79,13 +79,17 @@ See the [README.md](./tools/docker-compose/README.md) for docs on how to build t

### Building API Documentation

AWX includes support for building [Swagger/OpenAPI documentation](https://swagger.io). To build the documentation locally, run:
AWX includes support for building [Swagger/OpenAPI
documentation](https://swagger.io). To build the documentation locally, run:

```bash
(container)/awx_devel$ make swagger
```

This will write a file named `swagger.json` that contains the API specification in OpenAPI format. A variety of online tools are available for translating this data into more consumable formats (such as HTML). http://editor.swagger.io is an example of one such service.
This will write a file named `swagger.json` that contains the API specification
in OpenAPI format. A variety of online tools are available for translating
this data into more consumable formats (such as HTML). http://editor.swagger.io
is an example of one such service.
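Once generated, `swagger.json` can be inspected with ordinary JSON tooling; a sketch (the sample spec below is fabricated and far smaller than what `make swagger` really emits):

```shell
# Fabricated stand-in for the real swagger.json (illustrative only).
cat > swagger.json << 'EOF'
{"swagger": "2.0", "paths": {"/api/v2/jobs/": {"get": {"summary": "List jobs"}}}}
EOF

# Every key under "paths" is a documented endpoint.
python3 - << 'PYEOF'
import json

with open("swagger.json") as f:
    spec = json.load(f)

for path in sorted(spec["paths"]):
    print(path)
PYEOF
```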
### Accessing the AWX web interface

@@ -111,30 +115,20 @@ While you can use environment variables to skip the pre-commit hooks GitHub will

## What should I work on?

We have a ["good first issue" label](https://github.com/ansible/awx/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) we put on some issues that might be a good starting point for new contributors.

Fixing bugs and updating the documentation are always appreciated, so reviewing the backlog of issues is always a good place to start.

For feature work, take a look at the current [Enhancements](https://github.com/ansible/awx/issues?q=is%3Aissue+is%3Aopen+label%3Atype%3Aenhancement).

If an enhancement has someone assigned to it, that person is responsible for working on it. If you feel you could contribute, reach out to that person.

**NOTES**

> Issue assignment will only be done for maintainers of the project. If you decide to work on an issue, please feel free to add a comment in the issue to let others know that you are working on it; but know that we will accept the first pull request from whoever is able to fix an issue. Once your PR is accepted, we can add you as an assignee to an issue upon request.

Fixing bugs, adding translations, and updating the documentation are always appreciated, so reviewing the backlog of issues is always a good place to start. For extra information on debugging tools, see [Debugging](./docs/debugging/).

**NOTE**

> If you work in a part of the codebase that is going through active development, your changes may be rejected, or you may be asked to `rebase`. A good idea before starting work is to have a discussion with us in the `#ansible-awx` channel on irc.libera.chat, or on the [mailing list](https://groups.google.com/forum/#!forum/awx-project).

**NOTE**

> If you're planning to develop features or fixes for the UI, please review the [UI Developer doc](./awx/ui/README.md).

### Translations

At this time we do not accept PRs for adding additional language translations, because we have an automated process for generating them. Translations require constant care as new strings are added and changed in the code base, so the .po files are overwritten during every translation release cycle. We also can't support a large number of translations in AWX, as it's an open source project and each language adds time and cost to maintain. If you would like to see AWX translated into a new language, please create an issue and ask others you know to upvote it. Our translation team will review the needs of the community and see what they can do about supporting additional languages.

If you find an issue with an existing translation, please see the [Reporting Issues](#reporting-issues) section to open an issue, and our translation team will work with you on a resolution.

## Submitting Pull Requests

Fixes and features for AWX go through the GitHub pull request process. Submit your pull request (PR) against the `devel` branch.
@@ -158,14 +152,28 @@ We like to keep our commit history clean, and will require resubmission of pull

Sometimes it might take us a while to fully review your PR. We try to keep the `devel` branch in good working order, and so we review requests carefully. Please be patient.

When your PR is initially submitted, the checks will not be run until a maintainer allows them to be. Once a maintainer has done a quick review of your work, the PR will have the linter and unit tests run against it via GitHub Actions, and the status reported in the PR.
All submitted PRs will have the linter and unit tests run against them via Zuul, and the status reported in the PR.

## PR Checks run by Zuul

Zuul jobs for awx are defined in the [zuul-jobs](https://github.com/ansible/zuul-jobs) repo.

Zuul runs the following checks that must pass:

1. `tox-awx-api-lint`
2. `tox-awx-ui-lint`
3. `tox-awx-api`
4. `tox-awx-ui`
5. `tox-awx-swagger`

Zuul runs the following checks that are non-voting (they cannot fail the PR, but serve to inform PR reviewers):

1. `tox-awx-detect-schema-change`
   This check generates the schema and diffs it against a reference copy of the `devel` version of the schema.
   Reviewers should inspect the `job-output.txt.gz` related to the check if there is a failure (grep for `diff -u -b` to find the beginning of the diff).
   If the schema change is expected and makes sense in relation to the changes made by the PR, then you are good to go!
   If not, the schema changes should be fixed, but this decision must be enforced by reviewers.
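A sketch of how a reviewer might locate the schema diff after downloading the log; the log content below is fabricated for illustration, and only the `job-output.txt.gz` filename and the `diff -u -b` marker come from the text above.

```shell
# Fabricated stand-in for a downloaded Zuul log (illustrative content).
printf '%s\n' \
  "TASK [detect schema change]" \
  "diff -u -b reference-schema.json new-schema.json" \
  "- old field" \
  "+ new field" | gzip > job-output.txt.gz

# grep -n reports the line number where the schema diff begins.
zcat job-output.txt.gz | grep -n "diff -u -b"
```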
## Reporting Issues

We welcome your feedback and encourage you to file an issue when you run into a problem. But before opening a new issue, we ask that you please view our [Issues guide](./ISSUES.md).

## Getting Help

If you require additional assistance, please reach out to us at `#ansible-awx` on irc.libera.chat, or submit your question to the [mailing list](https://groups.google.com/forum/#!forum/awx-project).

For extra information on debugging tools, see [Debugging](./docs/debugging/).
@@ -232,9 +232,6 @@ class FieldLookupBackend(BaseFilterBackend):
                re.compile(value)
            except re.error as e:
                raise ValueError(e.args[0])
        elif new_lookup.endswith('__iexact'):
            if not isinstance(field, (CharField, TextField)):
                raise ValueError(f'{field.name} is not a text field and cannot be filtered by case-insensitive search')
        elif new_lookup.endswith('__search'):
            related_model = getattr(field, 'related_model', None)
            if not related_model:

@@ -261,8 +258,8 @@ class FieldLookupBackend(BaseFilterBackend):
        search_filters = {}
        needs_distinct = False
        # Can only have two values: 'AND', 'OR'
        # If 'AND' is used, an item must satisfy all conditions to show up in the results.
        # If 'OR' is used, an item just needs to satisfy one condition to appear in results.
        # If 'AND' is used, an iterm must satisfy all condition to show up in the results.
        # If 'OR' is used, an item just need to satisfy one condition to appear in results.
        search_filter_relation = 'OR'
        for key, values in request.query_params.lists():
            if key in self.RESERVED_NAMES:
@@ -29,6 +29,7 @@ from django.utils.translation import gettext_lazy as _
from django.utils.encoding import force_str
from django.utils.text import capfirst
from django.utils.timezone import now
from django.utils.functional import cached_property

# Django REST Framework
from rest_framework.exceptions import ValidationError, PermissionDenied

@@ -5007,7 +5008,8 @@ class ActivityStreamSerializer(BaseSerializer):
    object_association = serializers.SerializerMethodField(help_text=_("When present, shows the field name of the role or relationship that changed."))
    object_type = serializers.SerializerMethodField(help_text=_("When present, shows the model on which the role or relationship was defined."))

    def _local_summarizable_fk_fields(self, obj):
    @cached_property
    def _local_summarizable_fk_fields(self):
        summary_dict = copy.copy(SUMMARIZABLE_FK_FIELDS)
        # Special requests
        summary_dict['group'] = summary_dict['group'] + ('inventory_id',)

@@ -5027,13 +5029,7 @@ class ActivityStreamSerializer(BaseSerializer):
            ('workflow_approval', ('id', 'name', 'unified_job_id')),
            ('instance', ('id', 'hostname')),
        ]
        # Optimization - do not attempt to summarize all fields, pair down to only relations that exist
        if not obj:
            return field_list
        existing_association_types = [obj.object1, obj.object2]
        if 'user' in existing_association_types:
            existing_association_types.append('role')
        return [entry for entry in field_list if entry[0] in existing_association_types]
        return field_list

    class Meta:
        model = ActivityStream

@@ -5117,7 +5113,7 @@ class ActivityStreamSerializer(BaseSerializer):
        data = {}
        if obj.actor is not None:
            data['actor'] = self.reverse('api:user_detail', kwargs={'pk': obj.actor.pk})
        for fk, __ in self._local_summarizable_fk_fields(obj):
        for fk, __ in self._local_summarizable_fk_fields:
            if not hasattr(obj, fk):
                continue
            m2m_list = self._get_related_objects(obj, fk)

@@ -5174,7 +5170,7 @@ class ActivityStreamSerializer(BaseSerializer):

    def get_summary_fields(self, obj):
        summary_fields = OrderedDict()
        for fk, related_fields in self._local_summarizable_fk_fields(obj):
        for fk, related_fields in self._local_summarizable_fk_fields:
            try:
                if not hasattr(obj, fk):
                    continue
@@ -1440,7 +1440,7 @@ msgstr "指定した認証情報は無効 (HTTP 401) です。"

#: awx/api/views/root.py:193 awx/api/views/root.py:234
msgid "Unable to connect to proxy server."
msgstr "プロキシーサーバーに接続できません。"
msgstr "プロキシサーバーに接続できません。"

#: awx/api/views/root.py:195 awx/api/views/root.py:236
msgid "Could not connect to subscription service."

@@ -1976,7 +1976,7 @@ msgstr "リモートホスト名または IP を判別するために検索す

#: awx/main/conf.py:85
msgid "Proxy IP Allowed List"
msgstr "プロキシー IP 許可リスト"
msgstr "プロキシ IP 許可リスト"

#: awx/main/conf.py:87
msgid ""

@@ -2198,7 +2198,7 @@ msgid ""
"Follow symbolic links when scanning for playbooks. Be aware that setting "
"this to True can lead to infinite recursion if a link points to a parent "
"directory of itself."
msgstr "Playbook のスキャン時にシンボリックリンクをたどります。リンクが親ディレクトリーを参照している場合には、この設定を True に指定すると無限再帰が発生する可能性があります。"
msgstr "Playbook をスキャンするときは、シンボリックリンクをたどってください。リンクがそれ自体の親ディレクトリーを指している場合は、これを True に設定すると、無限再帰が発生する可能性があることに注意してください。"

#: awx/main/conf.py:337
msgid "Ignore Ansible Galaxy SSL Certificate Verification"

@@ -2499,7 +2499,7 @@ msgstr "Insights for Ansible Automation Platform の最終収集日。"
msgid ""
"Last gathered entries for expensive collectors for Insights for Ansible "
"Automation Platform."
msgstr "Insights for Ansible Automation Platform でコストがかかっているコレクターに関して最後に収集されたエントリー"
msgstr "Insights for Ansible Automation Platform の高価なコレクターの最後に収集されたエントリー。"

#: awx/main/conf.py:686
msgid "Insights for Ansible Automation Platform Gather Interval"

@@ -3692,7 +3692,7 @@ msgstr "タスクの開始"

#: awx/main/models/events.py:189
msgid "Variables Prompted"
msgstr "提示される変数"
msgstr "変数のプロモート"

#: awx/main/models/events.py:190
msgid "Gathering Facts"

@@ -3741,15 +3741,15 @@ msgstr "エラー"

#: awx/main/models/execution_environments.py:17
msgid "Always pull container before running."
msgstr "実行前に必ずコンテナーをプルする"
msgstr "実行前に必ずコンテナーをプルしてください。"

#: awx/main/models/execution_environments.py:18
msgid "Only pull the image if not present before running."
msgstr "イメージが存在しない場合のみ実行前にプルする"
msgstr "実行する前に、存在しない場合にのみイメージをプルしてください。"

#: awx/main/models/execution_environments.py:19
msgid "Never pull container before running."
msgstr "実行前にコンテナーをプルしない"
msgstr "実行前にコンテナーをプルしないでください。"

#: awx/main/models/execution_environments.py:29
msgid ""

@@ -5228,7 +5228,7 @@ msgid ""
"SSL) or \"ldaps://ldap.example.com:636\" (SSL). Multiple LDAP servers may be "
"specified by separating with spaces or commas. LDAP authentication is "
"disabled if this parameter is empty."
msgstr "\"ldap://ldap.example.com:389\" (非 SSL) または \"ldaps://ldap.example.com:636\" (SSL) などの LDAP サーバーに接続する URI です。複数の LDAP サーバーをスペースまたはコンマで区切って指定できます。LDAP 認証は、このパラメーターが空の場合は無効になります。"
msgstr "\"ldap://ldap.example.com:389\" (非 SSL) または \"ldaps://ldap.example.com:636\" (SSL) などの LDAP サーバーに接続する URI です。複数の LDAP サーバーをスペースまたはカンマで区切って指定できます。LDAP 認証は、このパラメーターが空の場合は無効になります。"

#: awx/sso/conf.py:170 awx/sso/conf.py:187 awx/sso/conf.py:198
#: awx/sso/conf.py:209 awx/sso/conf.py:226 awx/sso/conf.py:244

@@ -6236,5 +6236,4 @@ msgstr "%s が現在アップグレード中です。"

#: awx/ui/urls.py:24
msgid "This page will refresh when complete."
msgstr "このページは完了すると更新されます。"

msgstr "このページは完了すると更新されます。"
@@ -956,7 +956,7 @@ msgstr "인스턴스 그룹의 인스턴스"

#: awx/api/views/__init__.py:450
msgid "Schedules"
msgstr "스케줄"
msgstr "일정"

#: awx/api/views/__init__.py:464
msgid "Schedule Recurrence Rule Preview"

@@ -3261,7 +3261,7 @@ msgstr "JSON 또는 YAML 구문을 사용하여 인젝터를 입력합니다.

#: awx/main/models/credential/__init__.py:412
#, python-format
msgid "adding %s credential type"
msgstr "인증 정보 유형 %s 추가 중"
msgstr "인증 정보 유형 %s 추가 중"

#: awx/main/models/credential/__init__.py:590
#: awx/main/models/credential/__init__.py:672

@@ -6236,5 +6236,4 @@ msgstr "%s 현재 업그레이드 중입니다."

#: awx/ui/urls.py:24
msgid "This page will refresh when complete."
msgstr "완료되면 이 페이지가 새로 고침됩니다."

msgstr "완료되면 이 페이지가 새로 고침됩니다."
@@ -348,7 +348,7 @@ msgstr "SCM track_submodules 只能用于 git 项目。"
msgid ""
"Only Container Registry credentials can be associated with an Execution "
"Environment"
msgstr "只有容器注册表凭证才可以与执行环境关联"
msgstr "只有容器 registry 凭证可以与执行环境关联"

#: awx/api/serializers.py:1440
msgid "Cannot change the organization of an execution environment"

@@ -629,7 +629,7 @@ msgstr "不支持在不替换的情况下在启动时删除 {} 凭证。提供

#: awx/api/serializers.py:4338
msgid "The inventory associated with this Workflow is being deleted."
msgstr "与此工作流关联的清单将被删除。"
msgstr "与此 Workflow 关联的清单将被删除。"

#: awx/api/serializers.py:4405
msgid "Message type '{}' invalid, must be either 'message' or 'body'"

@@ -3229,7 +3229,7 @@ msgstr "云"

#: awx/main/models/credential/__init__.py:336
#: awx/main/models/credential/__init__.py:1113
msgid "Container Registry"
msgstr "容器注册表"
msgstr "容器 Registry"

#: awx/main/models/credential/__init__.py:337
msgid "Personal Access Token"

@@ -3560,7 +3560,7 @@ msgstr "身份验证 URL"

#: awx/main/models/credential/__init__.py:1120
msgid "Authentication endpoint for the container registry."
msgstr "容器注册表的身份验证端点。"
msgstr "容器 registry 的身份验证端点。"

#: awx/main/models/credential/__init__.py:1130
msgid "Password or Token"

@@ -3764,7 +3764,7 @@ msgstr "镜像位置"
msgid ""
"The full image location, including the container registry, image name, and "
"version tag."
msgstr "完整镜像位置,包括容器注册表、镜像名称和版本标签。"
msgstr "完整镜像位置,包括容器 registry、镜像名称和版本标签。"

#: awx/main/models/execution_environments.py:51
msgid "Pull image before running?"

@@ -6238,5 +6238,4 @@ msgstr "%s 当前正在升级。"

#: awx/ui/urls.py:24
msgid "This page will refresh when complete."
msgstr "完成后,此页面会刷新。"

msgstr "完成后,此页面会刷新。"
@@ -114,6 +114,13 @@ class Organization(CommonModel, NotificationFieldsModel, ResourceMixin, CustomVi
    def _get_related_jobs(self):
        return UnifiedJob.objects.non_polymorphic().filter(organization=self)

    def create_default_galaxy_credential(self):
        from awx.main.models import Credential

        public_galaxy_credential = Credential.objects.filter(managed=True, name='Ansible Galaxy').first()
        if public_galaxy_credential is not None and public_galaxy_credential not in self.galaxy_credentials.all():
            self.galaxy_credentials.add(public_galaxy_credential)


class OrganizationGalaxyCredentialMembership(models.Model):
@@ -409,7 +409,7 @@ def emit_activity_stream_change(instance):
    from awx.api.serializers import ActivityStreamSerializer

    actor = None
    if instance.actor_id:
    if instance.actor:
        actor = instance.actor.username
    summary_fields = ActivityStreamSerializer(instance).get_summary_fields(instance)
    analytics_logger.info(
@@ -20,7 +20,7 @@ def test_activity_stream_related():
    """
    serializer_related = set(
        ActivityStream._meta.get_field(field_name).related_model
        for field_name, stuff in ActivityStreamSerializer()._local_summarizable_fk_fields(None)
        for field_name, stuff in ActivityStreamSerializer()._local_summarizable_fk_fields
        if hasattr(ActivityStream, field_name)
    )

@@ -79,19 +79,6 @@ def test_invalid_field():
    assert 'is not an allowed field name. Must be ascii encodable.' in str(excinfo.value)


def test_valid_iexact():
    field_lookup = FieldLookupBackend()
    value, new_lookup, _ = field_lookup.value_to_python(JobTemplate, 'project__name__iexact', 'foo')
    assert 'foo' in value


def test_invalid_iexact():
    field_lookup = FieldLookupBackend()
    with pytest.raises(ValueError) as excinfo:
        field_lookup.value_to_python(Job, 'id__iexact', '1')
    assert 'is not a text field and cannot be filtered by case-insensitive search' in str(excinfo.value)


@pytest.mark.parametrize('lookup_suffix', ['', 'contains', 'startswith', 'in'])
@pytest.mark.parametrize('password_field', Credential.PASSWORD_FIELDS)
def test_filter_on_password_field(password_field, lookup_suffix):
@@ -1537,11 +1537,9 @@ register(
        ('is_superuser_attr', 'saml_attr'),
        ('is_superuser_value', 'value'),
        ('is_superuser_role', 'saml_role'),
        ('remove_superusers', True),
        ('is_system_auditor_attr', 'saml_attr'),
        ('is_system_auditor_value', 'value'),
        ('is_system_auditor_role', 'saml_role'),
        ('remove_system_auditors', True),
    ],
)

@@ -743,10 +743,8 @@ class SAMLUserFlagsAttrField(HybridDictField):
    is_superuser_attr = fields.CharField(required=False, allow_null=True)
    is_superuser_value = fields.CharField(required=False, allow_null=True)
    is_superuser_role = fields.CharField(required=False, allow_null=True)
    remove_superusers = fields.BooleanField(required=False, allow_null=True)
    is_system_auditor_attr = fields.CharField(required=False, allow_null=True)
    is_system_auditor_value = fields.CharField(required=False, allow_null=True)
    is_system_auditor_role = fields.CharField(required=False, allow_null=True)
    remove_system_auditors = fields.BooleanField(required=False, allow_null=True)

    child = _Forbidden()
@@ -77,21 +77,6 @@ def _update_m2m_from_expression(user, related, expr, remove=True):
        related.remove(user)


def get_or_create_with_default_galaxy_cred(**kwargs):
    from awx.main.models import Organization, Credential

    (org, org_created) = Organization.objects.get_or_create(**kwargs)
    if org_created:
        logger.debug("Created org {} (id {}) from {}".format(org.name, org.id, kwargs))
        public_galaxy_credential = Credential.objects.filter(managed=True, name='Ansible Galaxy').first()
        if public_galaxy_credential is not None:
            org.galaxy_credentials.add(public_galaxy_credential)
            logger.debug("Added default Ansible Galaxy credential to org")
        else:
            logger.debug("Could not find default Ansible Galaxy credential to add to org")
    return org
def _update_org_from_attr(user, related, attr, remove, remove_admins, remove_auditors, backend):
    from awx.main.models import Organization
    from django.conf import settings

@@ -109,7 +94,8 @@
            organization_name = org_name
        except Exception:
            organization_name = org_name
        org = get_or_create_with_default_galaxy_cred(name=organization_name)
        org = Organization.objects.get_or_create(name=organization_name)[0]
        org.create_default_galaxy_credential()
    else:
        org = Organization.objects.get(name=org_name)
    except ObjectDoesNotExist:
@@ -135,6 +121,7 @@ def update_user_orgs(backend, details, user=None, *args, **kwargs):
    """
    if not user:
        return
    from awx.main.models import Organization

    org_map = backend.setting('ORGANIZATION_MAP') or {}
    for org_name, org_opts in org_map.items():

@@ -143,7 +130,8 @@
            organization_name = organization_alias
        else:
            organization_name = org_name
        org = get_or_create_with_default_galaxy_cred(name=organization_name)
        org = Organization.objects.get_or_create(name=organization_name)[0]
        org.create_default_galaxy_credential()

        # Update org admins from expression(s).
        remove = bool(org_opts.get('remove', True))
@@ -164,14 +152,15 @@ def update_user_teams(backend, details, user=None, *args, **kwargs):
    """
    if not user:
        return
    from awx.main.models import Team
    from awx.main.models import Organization, Team

    team_map = backend.setting('TEAM_MAP') or {}
    for team_name, team_opts in team_map.items():
        # Get or create the org to update.
        if 'organization' not in team_opts:
            continue
        org = get_or_create_with_default_galaxy_cred(name=team_opts['organization'])
        org = Organization.objects.get_or_create(name=team_opts['organization'])[0]
        org.create_default_galaxy_credential()

        # Update team members from expression(s).
        team = Team.objects.get_or_create(name=team_name, organization=org)[0]
@@ -227,7 +216,8 @@ def update_user_teams_by_saml_attr(backend, details, user=None, *args, **kwargs)

    try:
        if settings.SAML_AUTO_CREATE_OBJECTS:
            org = get_or_create_with_default_galaxy_cred(name=organization_name)
            org = Organization.objects.get_or_create(name=organization_name)[0]
            org.create_default_galaxy_credential()
        else:
            org = Organization.objects.get(name=organization_name)
    except ObjectDoesNotExist:
@@ -255,7 +245,6 @@ def _check_flag(user, flag, attributes, user_flags_settings):
    is_role_key = "is_%s_role" % (flag)
    is_attr_key = "is_%s_attr" % (flag)
    is_value_key = "is_%s_value" % (flag)
    remove_setting = "remove_%ss" % (flag)

    # Check to see if we are respecting a role and, if so, does our user have that role?
    role_setting = user_flags_settings.get(is_role_key, None)
@@ -287,7 +276,7 @@ def _check_flag(user, flag, attributes, user_flags_settings):
|
||||
# if they don't match make sure that new_flag is false
|
||||
else:
|
||||
logger.debug(
|
||||
"For %s on %s attr %s (%s) did not match expected value '%s'"
|
||||
"Refusing %s for %s because attr %s (%s) did not match value '%s'"
|
||||
% (flag, user.username, attr_setting, attribute_value, user_flags_settings.get(is_value_key))
|
||||
)
|
||||
new_flag = False
|
||||
@@ -296,16 +285,8 @@ def _check_flag(user, flag, attributes, user_flags_settings):
|
||||
logger.debug("Giving %s %s from attribute %s" % (user.username, flag, attr_setting))
|
||||
new_flag = True
|
||||
|
||||
# Get the users old flag
|
||||
old_value = getattr(user, "is_%s" % (flag))
|
||||
|
||||
# If we are not removing the flag and they were a system admin and now we don't want them to be just return
|
||||
remove_flag = user_flags_settings.get(remove_setting, True)
|
||||
if not remove_flag and (old_value and not new_flag):
|
||||
logger.debug("Remove flag %s preventing removal of %s for %s" % (remove_flag, flag, user.username))
|
||||
return old_value, False
|
||||
|
||||
# If the user was flagged and we are going to make them not flagged make sure there is a message
|
||||
old_value = getattr(user, "is_%s" % (flag))
|
||||
if old_value and not new_flag:
|
||||
logger.debug("Revoking %s from %s" % (flag, user.username))
|
||||
|
||||
|
||||
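The `_check_flag` hunks above revert a guard that let `remove_<flag>s` settings block demotion of an already-flagged user. A minimal sketch of that guarded logic, assuming `new_flag` has already been computed from the role/attr checks (the function below is a simplified stand-in, not AWX's actual `_check_flag`):

```python
import logging

logger = logging.getLogger(__name__)


def check_flag(user, flag, new_flag, user_flags_settings):
    """Simplified sketch of the flag-removal guard seen in the diff above."""
    old_value = getattr(user, "is_%s" % flag)
    remove_flag = user_flags_settings.get("remove_%ss" % flag, True)

    # If removal is disabled, never demote a user who already has the flag.
    if not remove_flag and (old_value and not new_flag):
        logger.debug("Remove flag %s preventing removal of %s for %s" % (remove_flag, flag, user.username))
        return old_value, False

    if old_value and not new_flag:
        logger.debug("Revoking %s from %s" % (flag, user.username))

    # Second element reports whether the flag actually changed.
    return new_flag, old_value != new_flag
```

This mirrors the `(new_flag, changed)` tuples the parametrized tests below assert against, e.g. a superuser with `remove_superusers: False` keeps the flag unchanged.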
@@ -4,8 +4,8 @@ from unittest import mock

from django.utils.timezone import now

from awx.conf.registry import settings_registry
from awx.sso.pipeline import update_user_orgs, update_user_teams, update_user_orgs_by_saml_attr, update_user_teams_by_saml_attr, _check_flag

from awx.main.models import User, Team, Organization, Credential, CredentialType


@@ -92,13 +92,8 @@ class TestSAMLMap:
        assert Organization.objects.get(name="Default_Alias") is not None

        for o in Organization.objects.all():
            if o.name == 'Default':
                # The default org was already created and should not have a galaxy credential
                assert o.galaxy_credentials.count() == 0
            else:
                # The Default_Alias was created by SAML and should get the galaxy credential
                assert o.galaxy_credentials.count() == 1
                assert o.galaxy_credentials.first().name == 'Ansible Galaxy'
            assert o.galaxy_credentials.count() == 1
            assert o.galaxy_credentials.first().name == 'Ansible Galaxy'

    def test_update_user_teams(self, backend, users, galaxy_credential):
        u1, u2, u3 = users
@@ -208,13 +203,7 @@ class TestSAMLAttr:
            ],
        }

        mock_settings_obj = MockSettings()
        for key in settings_registry.get_registered_settings(category_slug='logging'):
            value = settings_registry.get_setting_field(key).get_default()
            setattr(mock_settings_obj, key, value)
        setattr(mock_settings_obj, 'DEBUG', True)

        return mock_settings_obj
        return MockSettings()

    @pytest.fixture
    def backend(self):
@@ -274,13 +263,8 @@ class TestSAMLAttr:
        assert Organization.objects.get(name="o1_alias").member_role.members.count() == 1

        for o in Organization.objects.all():
            if o.id in [o1.id, o2.id, o3.id]:
                # o[123] were created without a default galaxy cred
                assert o.galaxy_credentials.count() == 0
            else:
                # anything else created should have a default galaxy cred
                assert o.galaxy_credentials.count() == 1
                assert o.galaxy_credentials.first().name == 'Ansible Galaxy'
            assert o.galaxy_credentials.count() == 1
            assert o.galaxy_credentials.first().name == 'Ansible Galaxy'

    def test_update_user_teams_by_saml_attr(self, orgs, users, galaxy_credential, kwargs, mock_settings):
        with mock.patch('django.conf.settings', mock_settings):
@@ -338,13 +322,8 @@ class TestSAMLAttr:
        assert Team.objects.get(name='Green', organization__name='Default3').member_role.members.count() == 3

        for o in Organization.objects.all():
            if o.id in [o1.id, o2.id, o3.id]:
                # o[123] were created without a default galaxy cred
                assert o.galaxy_credentials.count() == 0
            else:
                # anything else created should have a default galaxy cred
                assert o.galaxy_credentials.count() == 1
                assert o.galaxy_credentials.first().name == 'Ansible Galaxy'
            assert o.galaxy_credentials.count() == 1
            assert o.galaxy_credentials.first().name == 'Ansible Galaxy'

    def test_update_user_teams_alias_by_saml_attr(self, orgs, users, galaxy_credential, kwargs, mock_settings):
        with mock.patch('django.conf.settings', mock_settings):
@@ -417,113 +396,73 @@ class TestSAMLAttr:
            assert o.galaxy_credentials.count() == 1
            assert o.galaxy_credentials.first().name == 'Ansible Galaxy'

    def test_galaxy_credential_no_auto_assign(self, users, kwargs, galaxy_credential, mock_settings):
        # A Galaxy credential should not be added to an existing org
        o = Organization.objects.create(name='Default1')
        o = Organization.objects.create(name='Default2')
        o = Organization.objects.create(name='Default3')
        o = Organization.objects.create(name='Default4')
        kwargs['response']['attributes']['memberOf'] = ['Default1']
        kwargs['response']['attributes']['groups'] = ['Blue']
        with mock.patch('django.conf.settings', mock_settings):
            for u in users:
                update_user_orgs_by_saml_attr(None, None, u, **kwargs)
                update_user_teams_by_saml_attr(None, None, u, **kwargs)

        assert Organization.objects.count() == 4
        for o in Organization.objects.all():
            assert o.galaxy_credentials.count() == 0


@pytest.mark.django_db
class TestSAMLUserFlags:
    @pytest.mark.parametrize(
        "user_flags_settings, expected, is_superuser",
        "user_flags_settings, expected",
        [
            # In this case we will pass no user flags so new_flag should be false and changed will def be false
            (
                {},
                (False, False),
                False,
            ),
            # In this case we will give the user a group to make them an admin
            (
                {'is_superuser_role': 'test-role-1'},
                (True, True),
                False,
            ),
            # In this case we will give the user a flag that will make then an admin
            (
                {'is_superuser_attr': 'is_superuser'},
                (True, True),
                False,
            ),
            # In this case we will give the user a flag but the wrong value
            (
                {'is_superuser_attr': 'is_superuser', 'is_superuser_value': 'junk'},
                (False, False),
                False,
            ),
            # In this case we will give the user a flag and the right value
            (
                {'is_superuser_attr': 'is_superuser', 'is_superuser_value': 'true'},
                (True, True),
                False,
            ),
            # In this case we will give the user a proper role and an is_superuser_attr role that they dont have, this should make them an admin
            (
                {'is_superuser_role': 'test-role-1', 'is_superuser_attr': 'gibberish', 'is_superuser_value': 'true'},
                (True, True),
                False,
            ),
            # In this case we will give the user a proper role and an is_superuser_attr role that they have, this should make them an admin
            (
                {'is_superuser_role': 'test-role-1', 'is_superuser_attr': 'test-role-1'},
                (True, True),
                False,
            ),
            # In this case we will give the user a proper role and an is_superuser_attr role that they have but a bad value, this should make them an admin
            (
                {'is_superuser_role': 'test-role-1', 'is_superuser_attr': 'is_superuser', 'is_superuser_value': 'junk'},
                (False, False),
                False,
            ),
            # In this case we will give the user everything
            (
                {'is_superuser_role': 'test-role-1', 'is_superuser_attr': 'is_superuser', 'is_superuser_value': 'true'},
                (True, True),
                False,
            ),
            # In this test case we will validate that a single attribute (instead of a list) still works
            (
                {'is_superuser_attr': 'name_id', 'is_superuser_value': 'test_id'},
                (True, True),
                False,
            ),
            # This will be a negative test for a single atrribute
            (
                {'is_superuser_attr': 'name_id', 'is_superuser_value': 'junk'},
                (False, False),
                False,
            ),
            # The user is already a superuser so we should remove them
            (
                {'is_superuser_attr': 'name_id', 'is_superuser_value': 'junk', 'remove_superusers': True},
                (False, True),
                True,
            ),
            # The user is already a superuser but we don't have a remove field
            (
                {'is_superuser_attr': 'name_id', 'is_superuser_value': 'junk', 'remove_superusers': False},
                (True, False),
                True,
            ),
        ],
    )
    def test__check_flag(self, user_flags_settings, expected, is_superuser):
    def test__check_flag(self, user_flags_settings, expected):
        user = User()
        user.username = 'John'
        user.is_superuser = is_superuser
        user.is_superuser = False

        attributes = {
            'email': ['noone@nowhere.com'],

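Several hunks above swap `get_or_create_with_default_galaxy_cred(...)` back to an inline `get_or_create` plus an unconditional `create_default_galaxy_credential()` call, and the tests assert that pre-existing orgs never gain the credential. A rough sketch of what such a helper does, assuming Django-style `get_or_create` semantics (the model argument here is a stand-in so the sketch is self-contained, not an AWX import):

```python
def get_or_create_with_default_galaxy_cred(Organization, **kwargs):
    """Create the org if missing; attach the default Galaxy credential
    only when the org was just created, so pre-existing orgs are
    left untouched (the behavior the tests above assert)."""
    org, created = Organization.objects.get_or_create(**kwargs)
    if created:
        org.create_default_galaxy_credential()
    return org
```

The difference from the reverted inline form is the `created` check: calling `create_default_galaxy_credential()` unconditionally would also touch orgs that already existed.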
@@ -123,11 +123,9 @@ class TestSAMLUserFlagsAttrField:
            {'is_superuser_attr': 'something'},
            {'is_superuser_value': 'value'},
            {'is_superuser_role': 'my_peeps'},
            {'remove_superusers': False},
            {'is_system_auditor_attr': 'something_else'},
            {'is_system_auditor_value': 'value2'},
            {'is_system_auditor_role': 'other_peeps'},
            {'remove_system_auditors': False},
        ],
    )
    def test_internal_value_valid(self, data):
@@ -167,17 +165,6 @@ class TestSAMLUserFlagsAttrField:
                    'junk2': ['Invalid field.'],
                },
            ),
            # make sure we can't pass a string to the boolean fields
            (
                {
                    'remove_superusers': 'test',
                    'remove_system_auditors': 'test',
                },
                {
                    "remove_superusers": ["Must be a valid boolean."],
                    "remove_system_auditors": ["Must be a valid boolean."],
                },
            ),
        ],
    )
    def test_internal_value_invalid(self, data, expected):

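The field tests above check that unknown keys are rejected with `Invalid field.` and that the `remove_*` settings reject non-boolean input with `Must be a valid boolean.`. A minimal validator sketch with the same error shape (a hand-rolled illustration, not the DRF field AWX actually uses):

```python
# Keys the user-flags mapping accepts; anything else is an error.
ALLOWED_KEYS = {
    'is_superuser_attr', 'is_superuser_value', 'is_superuser_role',
    'remove_superusers',
    'is_system_auditor_attr', 'is_system_auditor_value',
    'is_system_auditor_role', 'remove_system_auditors',
}


def validate_user_flags(data):
    """Return a dict of field-name -> error messages, empty if valid."""
    errors = {}
    for key, value in data.items():
        if key not in ALLOWED_KEYS:
            errors[key] = ['Invalid field.']
        elif key.startswith('remove_') and not isinstance(value, bool):
            # Strings like 'test' must not be coerced to booleans.
            errors[key] = ['Must be a valid boolean.']
    return errors
```

Rejecting non-`bool` values outright (rather than truthiness-coercing strings) is what makes `'remove_superusers': 'test'` fail in the parametrized cases above.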
@@ -24,7 +24,7 @@
    </script>
    <meta
      http-equiv="Content-Security-Policy"
      content="default-src 'self'; connect-src 'self' ws: wss:; style-src 'self' 'unsafe-inline'; script-src 'self' 'nonce-{{ csp_nonce }}' *.pendo.io; img-src 'self' *.pendo.io data:; worker-src 'self' blob: ;"
      content="default-src 'self'; connect-src 'self' ws: wss:; style-src 'self' 'unsafe-inline'; script-src 'self' 'nonce-{{ csp_nonce }}' *.pendo.io https://d3js.org; img-src 'self' *.pendo.io data:; worker-src 'self' blob: ;"
    />
    <link rel="shortcut icon" href="{% static 'media/favicon.ico' %}" />
    <% } else { %>

@@ -1,3 +0,0 @@
/* eslint-disable */
// https://d3js.org/d3-collection/ v1.0.7 Copyright 2018 Mike Bostock
!function(n,t){"object"==typeof exports&&"undefined"!=typeof module?t(exports):"function"==typeof define&&define.amd?define(["exports"],t):t(n.d3=n.d3||{})}(this,function(n){"use strict";function t(){}function e(n,e){var r=new t;if(n instanceof t)n.each(function(n,t){r.set(t,n)});else if(Array.isArray(n)){var i,u=-1,o=n.length;if(null==e)for(;++u<o;)r.set(u,n[u]);else for(;++u<o;)r.set(e(i=n[u],u,n),i)}else if(n)for(var s in n)r.set(s,n[s]);return r}function r(){return{}}function i(n,t,e){n[t]=e}function u(){return e()}function o(n,t,e){n.set(t,e)}function s(){}t.prototype=e.prototype={constructor:t,has:function(n){return"$"+n in this},get:function(n){return this["$"+n]},set:function(n,t){return this["$"+n]=t,this},remove:function(n){var t="$"+n;return t in this&&delete this[t]},clear:function(){for(var n in this)"$"===n[0]&&delete this[n]},keys:function(){var n=[];for(var t in this)"$"===t[0]&&n.push(t.slice(1));return n},values:function(){var n=[];for(var t in this)"$"===t[0]&&n.push(this[t]);return n},entries:function(){var n=[];for(var t in this)"$"===t[0]&&n.push({key:t.slice(1),value:this[t]});return n},size:function(){var n=0;for(var t in this)"$"===t[0]&&++n;return n},empty:function(){for(var n in this)if("$"===n[0])return!1;return!0},each:function(n){for(var t in this)"$"===t[0]&&n(this[t],t.slice(1),this)}};var f=e.prototype;function c(n,t){var e=new s;if(n instanceof s)n.each(function(n){e.add(n)});else if(n){var r=-1,i=n.length;if(null==t)for(;++r<i;)e.add(n[r]);else for(;++r<i;)e.add(t(n[r],r,n))}return e}s.prototype=c.prototype={constructor:s,has:f.has,add:function(n){return this["$"+(n+="")]=n,this},remove:f.remove,clear:f.clear,values:f.keys,size:f.size,empty:f.empty,each:f.each},n.nest=function(){var n,t,s,f=[],c=[];function a(r,i,u,o){if(i>=f.length)return null!=n&&r.sort(n),null!=t?t(r):r;for(var s,c,h,l=-1,v=r.length,p=f[i++],y=e(),d=u();++l<v;)(h=y.get(s=p(c=r[l])+""))?h.push(c):y.set(s,[c]);return 
y.each(function(n,t){o(d,t,a(n,i,u,o))}),d}return s={object:function(n){return a(n,0,r,i)},map:function(n){return a(n,0,u,o)},entries:function(n){return function n(e,r){if(++r>f.length)return e;var i,u=c[r-1];return null!=t&&r>=f.length?i=e.entries():(i=[],e.each(function(t,e){i.push({key:e,values:n(t,r)})})),null!=u?i.sort(function(n,t){return u(n.key,t.key)}):i}(a(n,0,u,o),0)},key:function(n){return f.push(n),s},sortKeys:function(n){return c[f.length-1]=n,s},sortValues:function(t){return n=t,s},rollup:function(n){return t=n,s}}},n.set=c,n.map=e,n.keys=function(n){var t=[];for(var e in n)t.push(e);return t},n.values=function(n){var t=[];for(var e in n)t.push(n[e]);return t},n.entries=function(n){var t=[];for(var e in n)t.push({key:e,value:n[e]});return t},Object.defineProperty(n,"__esModule",{value:!0})});
@@ -1,3 +0,0 @@
/* eslint-disable */
// https://d3js.org/d3-dispatch/ v1.0.6 Copyright 2019 Mike Bostock
!function(n,e){"object"==typeof exports&&"undefined"!=typeof module?e(exports):"function"==typeof define&&define.amd?define(["exports"],e):e((n=n||self).d3=n.d3||{})}(this,function(n){"use strict";var e={value:function(){}};function t(){for(var n,e=0,t=arguments.length,o={};e<t;++e){if(!(n=arguments[e]+"")||n in o||/[\s.]/.test(n))throw new Error("illegal type: "+n);o[n]=[]}return new r(o)}function r(n){this._=n}function o(n,e){return n.trim().split(/^|\s+/).map(function(n){var t="",r=n.indexOf(".");if(r>=0&&(t=n.slice(r+1),n=n.slice(0,r)),n&&!e.hasOwnProperty(n))throw new Error("unknown type: "+n);return{type:n,name:t}})}function i(n,e){for(var t,r=0,o=n.length;r<o;++r)if((t=n[r]).name===e)return t.value}function f(n,t,r){for(var o=0,i=n.length;o<i;++o)if(n[o].name===t){n[o]=e,n=n.slice(0,o).concat(n.slice(o+1));break}return null!=r&&n.push({name:t,value:r}),n}r.prototype=t.prototype={constructor:r,on:function(n,e){var t,r=this._,l=o(n+"",r),u=-1,a=l.length;if(!(arguments.length<2)){if(null!=e&&"function"!=typeof e)throw new Error("invalid callback: "+e);for(;++u<a;)if(t=(n=l[u]).type)r[t]=f(r[t],n.name,e);else if(null==e)for(t in r)r[t]=f(r[t],n.name,null);return this}for(;++u<a;)if((t=(n=l[u]).type)&&(t=i(r[t],n.name)))return t},copy:function(){var n={},e=this._;for(var t in e)n[t]=e[t].slice();return new r(n)},call:function(n,e){if((t=arguments.length-2)>0)for(var t,r,o=new Array(t),i=0;i<t;++i)o[i]=arguments[i+2];if(!this._.hasOwnProperty(n))throw new Error("unknown type: "+n);for(i=0,t=(r=this._[n]).length;i<t;++i)r[i].value.apply(e,o)},apply:function(n,e,t){if(!this._.hasOwnProperty(n))throw new Error("unknown type: "+n);for(var r=this._[n],o=0,i=r.length;o<i;++o)r[o].value.apply(e,t)}},n.dispatch=t,Object.defineProperty(n,"__esModule",{value:!0})});
3 awx/ui/public/static/js/d3-force.v1.min.js vendored
File diff suppressed because one or more lines are too long
3 awx/ui/public/static/js/d3-timer.v1.min.js vendored
@@ -1,3 +0,0 @@
/* eslint-disable */
// https://d3js.org/d3-timer/ v1.0.10 Copyright 2019 Mike Bostock
!function(t,n){"object"==typeof exports&&"undefined"!=typeof module?n(exports):"function"==typeof define&&define.amd?define(["exports"],n):n((t=t||self).d3=t.d3||{})}(this,function(t){"use strict";var n,e,o=0,i=0,r=0,u=1e3,l=0,c=0,f=0,a="object"==typeof performance&&performance.now?performance:Date,s="object"==typeof window&&window.requestAnimationFrame?window.requestAnimationFrame.bind(window):function(t){setTimeout(t,17)};function _(){return c||(s(m),c=a.now()+f)}function m(){c=0}function p(){this._call=this._time=this._next=null}function w(t,n,e){var o=new p;return o.restart(t,n,e),o}function d(){_(),++o;for(var t,e=n;e;)(t=c-e._time)>=0&&e._call.call(null,t),e=e._next;--o}function h(){c=(l=a.now())+f,o=i=0;try{d()}finally{o=0,function(){var t,o,i=n,r=1/0;for(;i;)i._call?(r>i._time&&(r=i._time),t=i,i=i._next):(o=i._next,i._next=null,i=t?t._next=o:n=o);e=t,v(r)}(),c=0}}function y(){var t=a.now(),n=t-l;n>u&&(f-=n,l=t)}function v(t){o||(i&&(i=clearTimeout(i)),t-c>24?(t<1/0&&(i=setTimeout(h,t-a.now()-f)),r&&(r=clearInterval(r))):(r||(l=a.now(),r=setInterval(y,u)),o=1,s(h)))}p.prototype=w.prototype={constructor:p,restart:function(t,o,i){if("function"!=typeof t)throw new TypeError("callback is not a function");i=(null==i?_():+i)+(null==o?0:+o),this._next||e===this||(e?e._next=this:n=this,e=this),this._call=t,this._time=i,v()},stop:function(){this._call&&(this._call=null,this._time=1/0,v())}},t.interval=function(t,n,e){var o=new p,i=n;return null==n?(o.restart(t,n,e),o):(n=+n,e=null==e?_():+e,o.restart(function r(u){u+=i,o.restart(r,i+=n,e),t(u)},n,e),o)},t.now=_,t.timeout=function(t,n,e){var o=new p;return n=null==n?0:+n,o.restart(function(e){o.stop(),t(e+n)},n,e),o},t.timer=w,t.timerFlush=d,Object.defineProperty(t,"__esModule",{value:!0})});
@@ -41,7 +41,6 @@ const Detail = ({
  className,
  dataCy,
  alwaysVisible,
  isEmpty,
  helpText,
  isEncrypted,
  isNotConfigured,
@@ -50,10 +49,6 @@ const Detail = ({
    return null;
  }

  if (isEmpty && !alwaysVisible) {
    return null;
  }

  const labelCy = dataCy ? `${dataCy}-label` : null;
  const valueCy = dataCy ? `${dataCy}-value` : null;

@@ -163,16 +163,16 @@ function JobListItem({
        <Td colSpan={showTypeColumn ? 6 : 5}>
          <ExpandableRowContent>
            <DetailList>
              {job.type === 'inventory_update' && (
                <Detail
                  dataCy="job-inventory-source-type"
                  label={t`Source`}
                  value={inventorySourceLabels?.map(([string, label]) =>
                    string === job.source ? label : null
                  )}
                  isEmpty={inventorySourceLabels?.length === 0}
                />
              )}
              {job.type === 'inventory_update' &&
                inventorySourceLabels.length > 0 && (
                  <Detail
                    dataCy="job-inventory-source-type"
                    label={t`Source`}
                    value={inventorySourceLabels.map(([string, label]) =>
                      string === job.source ? label : null
                    )}
                  />
                )}
              <LaunchedByDetail job={job} />
              {job.launch_type === 'scheduled' &&
                (schedule ? (
@@ -254,7 +254,7 @@ function JobListItem({
                  dataCy={`execution-environment-detail-${job.id}`}
                />
              )}
              {credentials && (
              {credentials && credentials.length > 0 && (
                <Detail
                  fullWidth
                  label={t`Credentials`}
@@ -275,7 +275,6 @@ function JobListItem({
                      ))}
                    </ChipGroup>
                  }
                  isEmpty={credentials.length === 0}
                />
              )}
              {labels && labels.count > 0 && (

@@ -203,49 +203,6 @@ describe('<JobListItem />', () => {
      wrapper.find('Detail[label="Execution Environment"] dd').text()
    ).toBe('Missing resource');
  });

  test('should not load Source', () => {
    wrapper = mountWithContexts(
      <table>
        <tbody>
          <JobListItem
            inventorySourceLabels={[]}
            job={{
              ...mockJob,
              type: 'inventory_update',
              summary_fields: {
                user_capabilities: {},
              },
            }}
          />
        </tbody>
      </table>
    );
    const source_detail = wrapper.find(`Detail[label="Source"]`).at(0);
    expect(source_detail.prop('isEmpty')).toEqual(true);
  });

  test('should not load Credentials', () => {
    wrapper = mountWithContexts(
      <table>
        <tbody>
          <JobListItem
            job={{
              ...mockJob,
              type: 'inventory_update',
              summary_fields: {
                credentials: [],
              },
            }}
          />
        </tbody>
      </table>
    );
    const credentials_detail = wrapper
      .find(`Detail[label="Credentials"]`)
      .at(0);
    expect(credentials_detail.prop('isEmpty')).toEqual(true);
  });
});

describe('<JobListItem with failed job />', () => {

@@ -29,11 +29,17 @@ function PromptInventorySourceDetail({ resource }) {
    summary_fields,
    update_cache_timeout,
    update_on_launch,
    update_on_project_update,
    verbosity,
  } = resource;

  let optionsList = '';
  if (overwrite || overwrite_vars || update_on_launch) {
  if (
    overwrite ||
    overwrite_vars ||
    update_on_launch ||
    update_on_project_update
  ) {
    optionsList = (
      <TextList component={TextListVariants.ul}>
        {overwrite && (
@@ -51,6 +57,11 @@ function PromptInventorySourceDetail({ resource }) {
            {t`Update on launch`}
          </TextListItem>
        )}
        {update_on_project_update && (
          <TextListItem component={TextListItemVariants.li}>
            {t`Update on project update`}
          </TextListItem>
        )}
      </TextList>
    );
  }
@@ -102,14 +113,15 @@ function PromptInventorySourceDetail({ resource }) {
        label={t`Cache Timeout`}
        value={`${update_cache_timeout} ${t`Seconds`}`}
      />
      <Detail
        fullWidth
        label={t`Credential`}
        value={summary_fields?.credentials?.map((cred) => (
          <CredentialChip key={cred?.id} credential={cred} isReadOnly />
        ))}
        isEmpty={summary_fields?.credentials?.length === 0}
      />
      {summary_fields?.credentials?.length > 0 && (
        <Detail
          fullWidth
          label={t`Credential`}
          value={summary_fields.credentials.map((cred) => (
            <CredentialChip key={cred?.id} credential={cred} isReadOnly />
          ))}
        />
      )}
      {source_regions && (
        <Detail
          fullWidth

@@ -67,6 +67,7 @@ describe('PromptInventorySourceDetail', () => {
        </li>,
        <li>Overwrite local variables from remote inventory source</li>,
        <li>Update on launch</li>,
        <li>Update on project update</li>,
      ])
    ).toEqual(true);
  });
@@ -78,19 +79,4 @@ describe('PromptInventorySourceDetail', () => {
    );
    assertDetail(wrapper, 'Organization', 'Deleted');
  });

  test('should not load Credentials', () => {
    wrapper = mountWithContexts(
      <PromptInventorySourceDetail
        resource={{
          ...mockInvSource,
          summary_fields: {
            credentials: [],
          },
        }}
      />
    );
    const credentials_detail = wrapper.find(`Detail[label="Credential"]`).at(0);
    expect(credentials_detail.prop('isEmpty')).toEqual(true);
  });
});

@@ -26,7 +26,7 @@ function PromptJobTemplateDetail({ resource }) {
    extra_vars,
    forks,
    host_config_key,
    instance_groups = [],
    instance_groups,
    job_slice_count,
    job_tags,
    job_type,
@@ -94,11 +94,9 @@ function PromptJobTemplateDetail({ resource }) {

  return (
    <>
      <Detail
        label={t`Activity`}
        value={<Sparkline jobs={recentJobs} />}
        isEmpty={summary_fields.recent_jobs?.length === 0}
      />
      {summary_fields.recent_jobs?.length > 0 && (
        <Detail value={<Sparkline jobs={recentJobs} />} label={t`Activity`} />
      )}
      <Detail label={t`Job Type`} value={toTitleCase(job_type)} />
      {summary_fields?.organization ? (
        <Detail
@@ -182,7 +180,7 @@ function PromptJobTemplateDetail({ resource }) {
        />
      )}
      {optionsList && <Detail label={t`Enabled Options`} value={optionsList} />}
      {summary_fields?.credentials && (
      {summary_fields?.credentials?.length > 0 && (
        <Detail
          fullWidth
          label={t`Credentials`}
@@ -197,10 +195,9 @@ function PromptJobTemplateDetail({ resource }) {
            ))}
            </ChipGroup>
          }
          isEmpty={summary_fields?.credentials?.length === 0}
        />
      )}
      {summary_fields?.labels?.results && (
      {summary_fields?.labels?.results?.length > 0 && (
        <Detail
          fullWidth
          label={t`Labels`}
@@ -217,28 +214,28 @@ function PromptJobTemplateDetail({ resource }) {
            ))}
            </ChipGroup>
          }
          isEmpty={summary_fields?.labels?.results?.length === 0}
        />
      )}
      <Detail
        fullWidth
        label={t`Instance Groups`}
        value={
          <ChipGroup
            numChips={5}
            totalChips={instance_groups?.length}
            ouiaId="prompt-jt-instance-group-chips"
          >
            {instance_groups?.map((ig) => (
              <Chip key={ig.id} isReadOnly>
                {ig.name}
              </Chip>
            ))}
          </ChipGroup>
        }
        isEmpty={instance_groups?.length === 0}
      />
      {job_tags && (
      {instance_groups?.length > 0 && (
        <Detail
          fullWidth
          label={t`Instance Groups`}
          value={
            <ChipGroup
              numChips={5}
              totalChips={instance_groups.length}
              ouiaId="prompt-jt-instance-group-chips"
            >
              {instance_groups.map((ig) => (
                <Chip key={ig.id} isReadOnly>
                  {ig.name}
                </Chip>
              ))}
            </ChipGroup>
          }
        />
      )}
      {job_tags?.length > 0 && (
        <Detail
          fullWidth
          label={t`Job Tags`}
@@ -255,10 +252,9 @@ function PromptJobTemplateDetail({ resource }) {
            ))}
            </ChipGroup>
          }
          isEmpty={job_tags?.length === 0}
        />
      )}
      {skip_tags && (
      {skip_tags?.length > 0 && (
        <Detail
          fullWidth
          label={t`Skip Tags`}
@@ -275,7 +271,6 @@ function PromptJobTemplateDetail({ resource }) {
            ))}
            </ChipGroup>
          }
          isEmpty={skip_tags?.length === 0}
        />
      )}
      {extra_vars && (

@@ -125,92 +125,4 @@ describe('PromptJobTemplateDetail', () => {
    assertDetail(wrapper, 'Organization', 'Deleted');
    assertDetail(wrapper, 'Project', 'Deleted');
  });

  test('should not load Activity', () => {
    wrapper = mountWithContexts(
      <PromptJobTemplateDetail
        resource={{
          ...mockJT,
          summary_fields: {
            recent_jobs: [],
          },
        }}
      />
    );
    const activity_detail = wrapper.find(`Detail[label="Activity"]`).at(0);
    expect(activity_detail.prop('isEmpty')).toEqual(true);
  });

  test('should not load Credentials', () => {
    wrapper = mountWithContexts(
      <PromptJobTemplateDetail
        resource={{
          ...mockJT,
          summary_fields: {
            credentials: [],
          },
        }}
      />
    );
    const credentials_detail = wrapper
      .find(`Detail[label="Credentials"]`)
      .at(0);
    expect(credentials_detail.prop('isEmpty')).toEqual(true);
  });

  test('should not load Labels', () => {
    wrapper = mountWithContexts(
      <PromptJobTemplateDetail
        resource={{
          ...mockJT,
          summary_fields: {
            labels: {
              results: [],
            },
          },
        }}
      />
    );
    const labels_detail = wrapper.find(`Detail[label="Labels"]`).at(0);
    expect(labels_detail.prop('isEmpty')).toEqual(true);
  });

  test('should not load Instance Groups', () => {
    wrapper = mountWithContexts(
      <PromptJobTemplateDetail
        resource={{
          ...mockJT,
          instance_groups: [],
        }}
      />
    );
    const instance_groups_detail = wrapper
      .find(`Detail[label="Instance Groups"]`)
      .at(0);
    expect(instance_groups_detail.prop('isEmpty')).toEqual(true);
  });

  test('should not load Job Tags', () => {
    wrapper = mountWithContexts(
      <PromptJobTemplateDetail
        resource={{
          ...mockJT,
          job_tags: '',
        }}
      />
    );
    expect(wrapper.find('Detail[label="Job Tags"]').length).toBe(0);
  });

  test('should not load Skip Tags', () => {
    wrapper = mountWithContexts(
      <PromptJobTemplateDetail
        resource={{
          ...mockJT,
          skip_tags: '',
        }}
      />
    );
    expect(wrapper.find('Detail[label="Skip Tags"]').length).toBe(0);
  });
});

@@ -57,11 +57,9 @@ function PromptWFJobTemplateDetail({ resource }) {

  return (
    <>
      <Detail
        label={t`Activity`}
        value={<Sparkline jobs={recentJobs} />}
        isEmpty={summary_fields?.recent_jobs?.length === 0}
      />
      {summary_fields?.recent_jobs?.length > 0 && (
        <Detail value={<Sparkline jobs={recentJobs} />} label={t`Activity`} />
      )}
      {summary_fields?.organization && (
        <Detail
          label={t`Organization`}
@@ -110,7 +108,7 @@ function PromptWFJobTemplateDetail({ resource }) {
          }
        />
      )}
      {summary_fields?.labels?.results && (
      {summary_fields?.labels?.results?.length > 0 && (
        <Detail
          fullWidth
          label={t`Labels`}
@@ -127,7 +125,6 @@ function PromptWFJobTemplateDetail({ resource }) {
            ))}
            </ChipGroup>
          }
          isEmpty={summary_fields?.labels?.results?.length === 0}
        />
      )}
      {extra_vars && (

@@ -62,36 +62,4 @@ describe('PromptWFJobTemplateDetail', () => {
      '---\nmock: data'
    );
  });

  test('should not load Activity', () => {
    wrapper = mountWithContexts(
      <PromptWFJobTemplateDetail
        resource={{
          ...mockWF,
          summary_fields: {
            recent_jobs: [],
          },
        }}
      />
    );
    const activity_detail = wrapper.find(`Detail[label="Activity"]`).at(0);
    expect(activity_detail.prop('isEmpty')).toEqual(true);
  });

  test('should not load Labels', () => {
    wrapper = mountWithContexts(
      <PromptWFJobTemplateDetail
        resource={{
          ...mockWF,
          summary_fields: {
            labels: {
              results: [],
            },
          },
        }}
      />
    );
    const labels_detail = wrapper.find(`Detail[label="Labels"]`).at(0);
    expect(labels_detail.prop('isEmpty')).toEqual(true);
  });
});

@@ -112,6 +112,7 @@
  "update_on_launch":true,
  "update_cache_timeout":2,
  "source_project":8,
  "update_on_project_update":true,
  "last_update_failed": true,
  "last_updated":null
}
@@ -68,32 +68,34 @@ function ResourceAccessListItem({ accessRecord, onRoleDelete }) {
      <Td dataLabel={t`Last name`}>{accessRecord.last_name}</Td>
      <Td dataLabel={t`Roles`}>
        <DetailList stacked>
          <Detail
            label={t`User Roles`}
            value={
              <ChipGroup
                numChips={5}
                totalChips={userRoles.length}
                ouiaId="user-role-chips"
              >
                {userRoles.map(renderChip)}
              </ChipGroup>
            }
            isEmpty={userRoles.length === 0}
          />
          <Detail
            label={t`Team Roles`}
            value={
              <ChipGroup
                numChips={5}
                totalChips={teamRoles.length}
                ouiaId="team-role-chips"
              >
                {teamRoles.map(renderChip)}
              </ChipGroup>
            }
            isEmpty={teamRoles.length === 0}
          />
          {userRoles.length > 0 && (
            <Detail
              label={t`User Roles`}
              value={
                <ChipGroup
                  numChips={5}
                  totalChips={userRoles.length}
                  ouiaId="user-role-chips"
                >
                  {userRoles.map(renderChip)}
                </ChipGroup>
              }
            />
          )}
          {teamRoles.length > 0 && (
            <Detail
              label={t`Team Roles`}
              value={
                <ChipGroup
                  numChips={5}
                  totalChips={teamRoles.length}
                  ouiaId="team-role-chips"
                >
                  {teamRoles.map(renderChip)}
                </ChipGroup>
              }
            />
          )}
        </DetailList>
      </Td>
    </Tr>

@@ -53,41 +53,5 @@ describe('<ResourceAccessListItem />', () => {

expect(wrapper.find('Td[dataLabel="First name"]').text()).toBe('jane');
expect(wrapper.find('Td[dataLabel="Last name"]').text()).toBe('brown');

const user_roles_detail = wrapper.find(`Detail[label="User Roles"]`).at(0);
expect(user_roles_detail.prop('isEmpty')).toEqual(true);
});

test('should not load team roles', async () => {
let wrapper;

await act(async () => {
wrapper = mountWithContexts(
<table>
<tbody>
<ResourceAccessListItem
accessRecord={{
...accessRecord,
summary_fields: {
direct_access: [
{
role: {
id: 3,
name: 'Member',
user_capabilities: { unattach: true },
},
},
],
indirect_access: [],
},
}}
onRoleDelete={() => {}}
/>
</tbody>
</table>
);
});
const team_roles_detail = wrapper.find(`Detail[label="Team Roles"]`).at(0);
expect(team_roles_detail.prop('isEmpty')).toEqual(true);
});
});
@@ -272,12 +272,11 @@ function TemplateListItem({
value={template.description}
dataCy={`template-${template.id}-description`}
/>
{summaryFields.recent_jobs ? (
{summaryFields.recent_jobs && summaryFields.recent_jobs.length ? (
<Detail
label={t`Activity`}
value={<Sparkline jobs={summaryFields.recent_jobs} />}
dataCy={`template-${template.id}-activity`}
isEmpty={summaryFields.recent_jobs.length === 0}
/>
) : null}
{summaryFields.inventory ? (
@@ -317,7 +316,7 @@ function TemplateListItem({
value={formatDateString(template.modified)}
dataCy={`template-${template.id}-last-modified`}
/>
{summaryFields.credentials ? (
{summaryFields.credentials && summaryFields.credentials.length ? (
<Detail
fullWidth
label={t`Credentials`}
@@ -338,10 +337,9 @@ function TemplateListItem({
</ChipGroup>
}
dataCy={`template-${template.id}-credentials`}
isEmpty={summaryFields.credentials.length === 0}
/>
) : null}
{summaryFields.labels && (
{summaryFields.labels && summaryFields.labels.results.length > 0 && (
<Detail
fullWidth
label={t`Labels`}
@@ -363,7 +361,6 @@ function TemplateListItem({
</ChipGroup>
}
dataCy={`template-${template.id}-labels`}
isEmpty={summaryFields.labels.results.length === 0}
/>
)}
</DetailList>
@@ -465,68 +465,4 @@ describe('<TemplateListItem />', () => {
).toEqual(true);
expect(wrapper.find(`Detail[label="Activity"] Sparkline`)).toHaveLength(1);
});

test('should not load Activity', async () => {
const wrapper = mountWithContexts(
<table>
<tbody>
<TemplateListItem
template={{
...mockJobTemplateData,
summary_fields: {
user_capabilities: {},
recent_jobs: [],
},
}}
/>
</tbody>
</table>
);
const activity_detail = wrapper.find(`Detail[label="Activity"]`).at(0);
expect(activity_detail.prop('isEmpty')).toEqual(true);
});

test('should not load Credentials', async () => {
const wrapper = mountWithContexts(
<table>
<tbody>
<TemplateListItem
template={{
...mockJobTemplateData,
summary_fields: {
user_capabilities: {},
credentials: [],
},
}}
/>
</tbody>
</table>
);
const credentials_detail = wrapper
.find(`Detail[label="Credentials"]`)
.at(0);
expect(credentials_detail.prop('isEmpty')).toEqual(true);
});

test('should not load Labels', async () => {
const wrapper = mountWithContexts(
<table>
<tbody>
<TemplateListItem
template={{
...mockJobTemplateData,
summary_fields: {
user_capabilities: {},
labels: {
results: [],
},
},
}}
/>
</tbody>
</table>
);
const labels_detail = wrapper.find(`Detail[label="Labels"]`).at(0);
expect(labels_detail.prop('isEmpty')).toEqual(true);
});
});
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
@@ -264,19 +264,20 @@ function CredentialDetail({ credential }) {
date={modified}
user={modified_by}
/>
<Detail
label={t`Enabled Options`}
value={
<TextList component={TextListVariants.ul}>
{enabledBooleanFields.map(({ id, label }) => (
<TextListItem key={id} component={TextListItemVariants.li}>
{label}
</TextListItem>
))}
</TextList>
}
isEmpty={enabledBooleanFields.length === 0}
/>
{enabledBooleanFields.length > 0 && (
<Detail
label={t`Enabled Options`}
value={
<TextList component={TextListVariants.ul}>
{enabledBooleanFields.map(({ id, label }) => (
<TextListItem key={id} component={TextListItemVariants.li}>
{label}
</TextListItem>
))}
</TextList>
}
/>
)}
</DetailList>
{Object.keys(inputSources).length > 0 && (
<PluginFieldText>
@@ -149,23 +149,4 @@ describe('<CredentialDetail />', () => {
wrapper.find('ModalBoxCloseButton').invoke('onClose')();
});
});

test('should not load enabled options', async () => {
await act(async () => {
wrapper = mountWithContexts(
<CredentialDetail
credential={{
...mockCredential,
results: {
inputs: null,
},
}}
/>
);
});
const enabled_options_detail = wrapper
.find(`Detail[label="Enabled Options"]`)
.at(0);
expect(enabled_options_detail.prop('isEmpty')).toEqual(true);
});
});
@@ -67,11 +67,9 @@ function HostDetail({ host }) {
<HostToggle host={host} css="padding-bottom: 40px" />
<DetailList gutter="sm">
<Detail label={t`Name`} value={name} dataCy="host-name" />
<Detail
label={t`Activity`}
value={<Sparkline jobs={recentJobs} />}
isEmpty={recentJobs?.length === 0}
/>
{recentJobs?.length > 0 && (
<Detail label={t`Activity`} value={<Sparkline jobs={recentJobs} />} />
)}
<Detail label={t`Description`} value={description} />
<Detail
label={t`Inventory`}

@@ -81,8 +81,6 @@ describe('<HostDetail />', () => {
expect(wrapper.find(`Detail[label="Activity"] Sparkline`)).toHaveLength(
0
);
const activity_detail = wrapper.find(`Detail[label="Activity"]`).at(0);
expect(activity_detail.prop('isEmpty')).toEqual(true);
});

test('should hide edit button for users without edit permission', async () => {
@@ -79,17 +79,17 @@ function InventoryDetail({ inventory }) {
}
/>
<Detail label={t`Total hosts`} value={inventory.total_hosts} />
{instanceGroups && (
{instanceGroups && instanceGroups.length > 0 && (
<Detail
fullWidth
label={t`Instance Groups`}
value={
<ChipGroup
numChips={5}
totalChips={instanceGroups?.length}
totalChips={instanceGroups.length}
ouiaId="instance-group-chips"
>
{instanceGroups?.map((ig) => (
{instanceGroups.map((ig) => (
<Chip
key={ig.id}
isReadOnly
@@ -100,29 +100,28 @@ function InventoryDetail({ inventory }) {
))}
</ChipGroup>
}
isEmpty={instanceGroups.length === 0}
/>
)}
{inventory.summary_fields.labels && (
<Detail
fullWidth
helpText={helpText.labels}
label={t`Labels`}
value={
<ChipGroup
numChips={5}
totalChips={inventory.summary_fields.labels?.results?.length}
>
{inventory.summary_fields.labels?.results?.map((l) => (
<Chip key={l.id} isReadOnly>
{l.name}
</Chip>
))}
</ChipGroup>
}
isEmpty={inventory.summary_fields.labels?.results?.length === 0}
/>
)}
{inventory.summary_fields.labels &&
inventory.summary_fields.labels?.results?.length > 0 && (
<Detail
fullWidth
helpText={helpText.labels}
label={t`Labels`}
value={
<ChipGroup
numChips={5}
totalChips={inventory.summary_fields.labels.results.length}
>
{inventory.summary_fields.labels.results.map((l) => (
<Chip key={l.id} isReadOnly>
{l.name}
</Chip>
))}
</ChipGroup>
}
/>
)}
<VariablesDetail
label={t`Variables`}
helpText={helpText.variables()}

@@ -153,9 +153,6 @@ describe('<InventoryDetail />', () => {
expect(InventoriesAPI.readInstanceGroups).toHaveBeenCalledWith(
mockInventory.id
);
const instance_groups_detail = wrapper
.find(`Detail[label="Instance Groups"]`)
.at(0);
expect(instance_groups_detail.prop('isEmpty')).toEqual(true);
expect(wrapper.find(`Detail[label="Instance Groups"]`)).toHaveLength(0);
});
});
@@ -72,11 +72,12 @@ function InventoryHostDetail({ host }) {
<HostToggle host={host} css="padding-bottom: 40px" />
<DetailList gutter="sm">
<Detail label={t`Name`} value={name} />
<Detail
label={t`Activity`}
value={<Sparkline jobs={recentPlaybookJobs} />}
isEmpty={recentPlaybookJobs?.length === 0}
/>
{recentPlaybookJobs?.length > 0 && (
<Detail
label={t`Activity`}
value={<Sparkline jobs={recentPlaybookJobs} />}
/>
)}
<Detail label={t`Description`} value={description} />
<UserDateDetail date={created} label={t`Created`} user={created_by} />
<UserDateDetail

@@ -91,8 +91,6 @@ describe('<InventoryHostDetail />', () => {
expect(wrapper.find(`Detail[label="Activity"] Sparkline`)).toHaveLength(
0
);
const activity_detail = wrapper.find(`Detail[label="Activity"]`).at(0);
expect(activity_detail.prop('isEmpty')).toEqual(true);
});

test('should hide edit button for users without edit permission', async () => {
@@ -216,7 +216,7 @@ function InventoryList() {
headerRow={
<HeaderRow qsConfig={QS_CONFIG}>
<HeaderCell sortKey="name">{t`Name`}</HeaderCell>
<HeaderCell>{t`Sync Status`}</HeaderCell>
<HeaderCell>{t`Status`}</HeaderCell>
<HeaderCell>{t`Type`}</HeaderCell>
<HeaderCell>{t`Organization`}</HeaderCell>
<HeaderCell>{t`Actions`}</HeaderCell>
@@ -31,6 +31,7 @@ describe('<InventorySourceAdd />', () => {
source_vars: '---↵',
update_cache_timeout: 0,
update_on_launch: false,
update_on_project_update: false,
verbosity: 1,
};
@@ -49,6 +49,7 @@ function InventorySourceDetail({ inventorySource }) {
source_vars,
update_cache_timeout,
update_on_launch,
update_on_project_update,
verbosity,
enabled_var,
enabled_value,
@@ -112,7 +113,12 @@ function InventorySourceDetail({ inventorySource }) {
);

let optionsList = '';
if (overwrite || overwrite_vars || update_on_launch) {
if (
overwrite ||
overwrite_vars ||
update_on_launch ||
update_on_project_update
) {
optionsList = (
<TextList component={TextListVariants.ul}>
{overwrite && (
@@ -137,6 +143,16 @@ function InventorySourceDetail({ inventorySource }) {
/>
</TextListItem>
)}
{update_on_project_update && (
<TextListItem component={TextListItemVariants.li}>
{t`Update on project update`}
<Popover
content={helpText.subFormOptions.updateOnProjectUpdate({
value: source_project,
})}
/>
</TextListItem>
)}
</TextList>
);
}
@@ -252,14 +268,15 @@ function InventorySourceDetail({ inventorySource }) {
helpText={helpText.enabledValue}
value={enabled_value}
/>
<Detail
fullWidth
label={t`Credential`}
value={credentials?.map((cred) => (
<CredentialChip key={cred?.id} credential={cred} isReadOnly />
))}
isEmpty={credentials?.length === 0}
/>
{credentials?.length > 0 && (
<Detail
fullWidth
label={t`Credential`}
value={credentials.map((cred) => (
<CredentialChip key={cred?.id} credential={cred} isReadOnly />
))}
/>
)}
{optionsList && (
<Detail fullWidth label={t`Enabled Options`} value={optionsList} />
)}
@@ -113,6 +113,7 @@ describe('InventorySourceDetail', () => {
'Overwrite local groups and hosts from remote inventory source',
'Overwrite local variables from remote inventory source',
'Update on launch',
'Update on project update',
]).toContain(option.text());
});
});
@@ -236,21 +237,4 @@ describe('InventorySourceDetail', () => {
(el) => el.length === 0
);
});

test('should not load Credentials', async () => {
await act(async () => {
wrapper = mountWithContexts(
<InventorySourceDetail
inventorySource={{
...mockInvSource,
summary_fields: {
credentials: [],
},
}}
/>
);
});
const credentials_detail = wrapper.find(`Detail[label="Credential"]`).at(0);
expect(credentials_detail.prop('isEmpty')).toEqual(true);
});
});
@@ -31,6 +31,7 @@ describe('<InventorySourceEdit />', () => {
source_vars: '---↵',
update_cache_timeout: 0,
update_on_launch: false,
update_on_project_update: false,
verbosity: 1,
};
const mockInventory = {
@@ -96,11 +96,12 @@ function SmartInventoryDetail({ inventory }) {
<CardBody>
<DetailList>
<Detail label={t`Name`} value={name} />
<Detail
label={t`Activity`}
value={<Sparkline jobs={recentJobs} />}
isEmpty={recentJobs.length === 0}
/>
{recentJobs.length > 0 && (
<Detail
label={t`Activity`}
value={<Sparkline jobs={recentJobs} />}
/>
)}
<Detail label={t`Description`} value={description} />
<Detail label={t`Type`} value={t`Smart inventory`} />
<Detail
@@ -117,28 +118,29 @@ function SmartInventoryDetail({ inventory }) {
value={<Label variant="outline">{host_filter}</Label>}
/>
<Detail label={t`Total hosts`} value={total_hosts} />
<Detail
fullWidth
label={t`Instance groups`}
value={
<ChipGroup
numChips={5}
totalChips={instanceGroups.length}
ouiaId="instance-group-chips"
>
{instanceGroups.map((ig) => (
<Chip
key={ig.id}
isReadOnly
ouiaId={`instance-group-${ig.id}-chip`}
>
{ig.name}
</Chip>
))}
</ChipGroup>
}
isEmpty={instanceGroups.length === 0}
/>
{instanceGroups.length > 0 && (
<Detail
fullWidth
label={t`Instance groups`}
value={
<ChipGroup
numChips={5}
totalChips={instanceGroups.length}
ouiaId="instance-group-chips"
>
{instanceGroups.map((ig) => (
<Chip
key={ig.id}
isReadOnly
ouiaId={`instance-group-${ig.id}-chip`}
>
{ig.name}
</Chip>
))}
</ChipGroup>
}
/>
)}
<VariablesDetail
label={t`Variables`}
value={variables}
@@ -112,41 +112,6 @@ describe('<SmartInventoryDetail />', () => {
(el) => el.length === 0
);
});

test('should not load Activity', async () => {
await act(async () => {
wrapper = mountWithContexts(
<SmartInventoryDetail
inventory={{
...mockSmartInventory,
recent_jobs: [],
}}
/>
);
});
const activity_detail = wrapper.find(`Detail[label="Activity"]`).at(0);
expect(activity_detail.prop('isEmpty')).toEqual(true);
});

test('should not load Instance Groups', async () => {
InventoriesAPI.readInstanceGroups.mockResolvedValue({
data: {
results: [],
},
});

let wrapper;
await act(async () => {
wrapper = mountWithContexts(
<SmartInventoryDetail inventory={mockSmartInventory} />
);
});
wrapper.update();
const instance_groups_detail = wrapper
.find(`Detail[label="Instance groups"]`)
.at(0);
expect(instance_groups_detail.prop('isEmpty')).toEqual(true);
});
});

describe('User has read-only permissions', () => {
@@ -28,11 +28,12 @@ function SmartInventoryHostDetail({ host }) {
<CardBody>
<DetailList gutter="sm">
<Detail label={t`Name`} value={name} />
<Detail
label={t`Activity`}
value={<Sparkline jobs={recentPlaybookJobs} />}
isEmpty={recentPlaybookJobs?.length === 0}
/>
{recentPlaybookJobs?.length > 0 && (
<Detail
label={t`Activity`}
value={<Sparkline jobs={recentPlaybookJobs} />}
/>
)}
<Detail label={t`Description`} value={description} />
<Detail
label={t`Inventory`}
@@ -27,19 +27,4 @@ describe('<SmartInventoryHostDetail />', () => {
expect(wrapper.find('Detail[label="Activity"] Sparkline')).toHaveLength(1);
expect(wrapper.find('VariablesDetail')).toHaveLength(1);
});

test('should not load Activity', () => {
wrapper = mountWithContexts(
<SmartInventoryHostDetail
host={{
...mockHost,
summary_fields: {
recent_jobs: [],
},
}}
/>
);
const activity_detail = wrapper.find(`Detail[label="Activity"]`).at(0);
expect(activity_detail.prop('isEmpty')).toEqual(true);
});
});
@@ -73,6 +73,7 @@ const InventorySourceFormFields = ({
source_vars: '---\n',
update_cache_timeout: 0,
update_on_launch: false,
update_on_project_update: false,
verbosity: 1,
enabled_var: '',
enabled_value: '',
@@ -250,6 +251,7 @@ const InventorySourceForm = ({
source_vars: source?.source_vars || '---\n',
update_cache_timeout: source?.update_cache_timeout || 0,
update_on_launch: source?.update_on_launch || false,
update_on_project_update: source?.update_on_project_update || false,
verbosity: source?.verbosity || 1,
enabled_var: source?.enabled_var || '',
enabled_value: source?.enabled_value || '',
@@ -17,6 +17,7 @@ const initialValues = {
source_vars: '---\n',
update_cache_timeout: 0,
update_on_launch: true,
update_on_project_update: false,
verbosity: 1,
};

@@ -17,6 +17,7 @@ const initialValues = {
source_vars: '---\n',
update_cache_timeout: 0,
update_on_launch: true,
update_on_project_update: false,
verbosity: 1,
};

@@ -17,6 +17,7 @@ const initialValues = {
source_vars: '---\n',
update_cache_timeout: 0,
update_on_launch: true,
update_on_project_update: false,
verbosity: 1,
};

@@ -17,6 +17,7 @@ const initialValues = {
source_vars: '---\n',
update_cache_timeout: 0,
update_on_launch: true,
update_on_project_update: false,
verbosity: 1,
};

@@ -17,6 +17,7 @@ const initialValues = {
source_vars: '---\n',
update_cache_timeout: 0,
update_on_launch: true,
update_on_project_update: false,
verbosity: 1,
};

@@ -17,6 +17,7 @@ const initialValues = {
source_vars: '---\n',
update_cache_timeout: 0,
update_on_launch: true,
update_on_project_update: false,
verbosity: 1,
};

@@ -17,6 +17,7 @@ const initialValues = {
source_vars: '---\n',
update_cache_timeout: 0,
update_on_launch: true,
update_on_project_update: false,
verbosity: 1,
};
@@ -120,6 +121,7 @@ describe('<SCMSubForm />', () => {
source_vars: '---\n',
update_cache_timeout: 0,
update_on_launch: true,
update_on_project_update: false,
verbosity: 1,
};
let customWrapper;
@@ -137,4 +139,63 @@ describe('<SCMSubForm />', () => {
customWrapper.update();
expect(customWrapper.find('Select').prop('selections')).toBe('newPath');
});
test('Update on project update should be disabled', async () => {
const customInitialValues = {
credential: { id: 1, name: 'Credential' },
custom_virtualenv: '',
overwrite: false,
overwrite_vars: false,
source_path: '/path',
source_project: { id: 1, name: 'Source project' },
source_script: null,
source_vars: '---\n',
update_cache_timeout: 0,
update_on_launch: true,
update_on_project_update: false,
verbosity: 1,
};
let customWrapper;
await act(async () => {
customWrapper = mountWithContexts(
<Formik initialValues={customInitialValues}>
<SCMSubForm />
</Formik>
);
});
expect(
customWrapper
.find('Checkbox[aria-label="Update on project update"]')
.prop('isDisabled')
).toBe(true);
});

test('Update on launch should be disabled', async () => {
const customInitialValues = {
credential: { id: 1, name: 'Credential' },
custom_virtualenv: '',
overwrite: false,
overwrite_vars: false,
source_path: '/path',
source_project: { id: 1, name: 'Source project' },
source_script: null,
source_vars: '---\n',
update_cache_timeout: 0,
update_on_launch: false,
update_on_project_update: true,
verbosity: 1,
};
let customWrapper;
await act(async () => {
customWrapper = mountWithContexts(
<Formik initialValues={customInitialValues}>
<SCMSubForm />
</Formik>
);
});
expect(
customWrapper
.find('Checkbox[aria-label="Update on launch"]')
.prop('isDisabled')
).toBe(true);
});
});
@@ -17,6 +17,7 @@ const initialValues = {
source_vars: '---\n',
update_cache_timeout: 0,
update_on_launch: true,
update_on_project_update: false,
verbosity: 1,
};
@@ -53,10 +53,11 @@ export const VerbosityField = () => {
);
};

export const OptionsField = () => {
export const OptionsField = ({ showProjectUpdate = false }) => {
const [updateOnLaunchField] = useField('update_on_launch');
const [, , updateCacheTimeoutHelper] = useField('update_cache_timeout');
const [projectField] = useField('source_project');
const [updatedOnProjectUpdateField] = useField('update_on_project_update');

useEffect(() => {
if (!updateOnLaunchField.value) {
@@ -82,11 +83,23 @@ export const OptionsField = () => {
tooltip={helpText.subFormOptions.overwriteVariables}
/>
<CheckboxField
isDisabled={updatedOnProjectUpdateField.value}
id="update_on_launch"
name="update_on_launch"
label={t`Update on launch`}
tooltip={helpText.subFormOptions.updateOnLaunch(projectField)}
/>
{showProjectUpdate && (
<CheckboxField
isDisabled={updateOnLaunchField.value}
id="update_on_project_update"
name="update_on_project_update"
label={t`Update on project update`}
tooltip={helpText.subFormOptions.updateOnProjectUpdate(
projectField
)}
/>
)}
</FormCheckboxLayout>
</FormGroup>
</FormFullWidthLayout>
@@ -17,6 +17,7 @@ const initialValues = {
source_vars: '---\n',
update_cache_timeout: 0,
update_on_launch: true,
update_on_project_update: false,
verbosity: 1,
};

@@ -17,6 +17,7 @@ const initialValues = {
source_vars: '---\n',
update_cache_timeout: 0,
update_on_launch: true,
update_on_project_update: false,
verbosity: 1,
};
@@ -115,6 +115,7 @@
"update_on_launch":true,
"update_cache_timeout":2,
"source_project":8,
"update_on_project_update":true,
"last_update_failed": true,
"last_updated":null,
"execution_environment": 1
@@ -268,14 +268,15 @@ function JobDetail({ job, inventorySourceLabels }) {
</Link>
}
/>
<Detail
dataCy="job-inventory-source-type"
label={t`Source`}
value={inventorySourceLabels.map(([string, label]) =>
string === job.source ? label : null
)}
isEmpty={inventorySourceLabels.length === 0}
/>
{inventorySourceLabels.length > 0 && (
<Detail
dataCy="job-inventory-source-type"
label={t`Source`}
value={inventorySourceLabels.map(([string, label]) =>
string === job.source ? label : null
)}
/>
)}
</>
)}
{inventory_source && inventory_source.source === 'scm' && (
@@ -405,7 +406,7 @@ function JobDetail({ job, inventorySourceLabels }) {
}
/>
)}
{credentials && (
{credentials && credentials.length > 0 && (
<Detail
dataCy="job-credentials"
fullWidth
@@ -427,7 +428,6 @@ function JobDetail({ job, inventorySourceLabels }) {
))}
</ChipGroup>
}
isEmpty={credentials.length === 0}
/>
)}
{labels && labels.count > 0 && (
@@ -451,7 +451,7 @@ function JobDetail({ job, inventorySourceLabels }) {
}
/>
)}
{job.job_tags && (
{job.job_tags && job.job_tags.length > 0 && (
<Detail
dataCy="job-tags"
fullWidth
@@ -474,10 +474,9 @@ function JobDetail({ job, inventorySourceLabels }) {
))}
</ChipGroup>
}
isEmpty={job.job_tags.length === 0}
/>
)}
{job.skip_tags && (
{job.skip_tags && job.skip_tags.length > 0 && (
<Detail
dataCy="job-skip-tags"
fullWidth
@@ -500,7 +499,6 @@ function JobDetail({ job, inventorySourceLabels }) {
))}
</ChipGroup>
}
isEmpty={job.skip_tags.length === 0}
/>
)}
<Detail
@@ -548,64 +548,4 @@ describe('<JobDetail />', () => {
assertDetail('Inventory', 'Demo Inventory');
assertDetail('Job Slice Parent', 'True');
});

test('should not load Source', () => {
wrapper = mountWithContexts(
<JobDetail
job={{
...mockJobData,
summary_fields: {
inventory_source: {},
user_capabilities: {},
inventory: { id: 1 },
},
}}
inventorySourceLabels={[]}
/>
);
const source_detail = wrapper.find(`Detail[label="Source"]`).at(0);
expect(source_detail.prop('isEmpty')).toEqual(true);
});

test('should not load Credentials', () => {
wrapper = mountWithContexts(
<JobDetail
job={{
...mockJobData,
summary_fields: {
user_capabilities: {},
credentials: [],
},
}}
/>
);
const credentials_detail = wrapper
.find(`Detail[label="Credentials"]`)
.at(0);
expect(credentials_detail.prop('isEmpty')).toEqual(true);
});

test('should not load Job Tags', () => {
wrapper = mountWithContexts(
<JobDetail
job={{
...mockJobData,
job_tags: '',
}}
/>
);
expect(wrapper.find('Detail[label="Job Tags"]').length).toBe(0);
});

test('should not load Skip Tags', () => {
wrapper = mountWithContexts(
<JobDetail
job={{
...mockJobData,
skip_tags: '',
}}
/>
);
expect(wrapper.find('Detail[label="Skip Tags"]').length).toBe(0);
});
});
@@ -163,11 +163,6 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
    }
    addEvents(onReadyEvents);
    setOnReadyEvents([]);
    if (isFollowModeEnabled) {
      setTimeout(() => {
        scrollToEnd();
      }, 0);
    }
  }, [isTreeReady, onReadyEvents]); // eslint-disable-line react-hooks/exhaustive-deps

  const totalNonCollapsedRows = Math.max(
@@ -193,6 +188,9 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
  }, [location.search]); // eslint-disable-line react-hooks/exhaustive-deps

  useEffect(() => {
    if (!isJobRunning(jobStatus)) {
      setIsFollowModeEnabled(false);
    }
    rebuildEventsTree();
  }, [isFlatMode]); // eslint-disable-line react-hooks/exhaustive-deps

@@ -244,7 +242,7 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
      if (data.group_name === `${job.type}_events`) {
        batchedEvents.push(data);
        clearTimeout(batchTimeout);
        if (batchedEvents.length >= 10) {
        if (batchedEvents.length >= 25) {
          addBatchedEvents();
        } else {
          batchTimeout = setTimeout(addBatchedEvents, 500);
@@ -270,12 +268,6 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
    };
  }, [isJobRunning(jobStatus)]); // eslint-disable-line react-hooks/exhaustive-deps

  useEffect(() => {
    if (isFollowModeEnabled) {
      scrollToEnd();
    }
  }, [wsEvents.length, isFollowModeEnabled]); // eslint-disable-line react-hooks/exhaustive-deps

  useEffect(() => {
    if (listRef.current?.recomputeRowHeights) {
      listRef.current.recomputeRowHeights();
@@ -427,6 +419,9 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
  };

  const rowRenderer = ({ index, parent, key, style }) => {
    if (listRef.current && isFollowModeEnabled) {
      setTimeout(() => scrollToRow(remoteRowCount - 1), 0);
    }
    let event;
    let node;
    try {
@@ -573,9 +568,6 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
    loadRange.forEach((n) => {
      cache.clear(n);
    });
    if (isFollowModeEnabled) {
      scrollToEnd();
    }
  };

  const scrollToRow = (rowIndex) => {
@@ -588,16 +580,14 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
  const handleScrollPrevious = () => {
    const startIndex = listRef.current.Grid._renderedRowStartIndex;
    const stopIndex = listRef.current.Grid._renderedRowStopIndex;
    const scrollRange = stopIndex - startIndex;
    const scrollRange = stopIndex - startIndex + 1;
    scrollToRow(Math.max(0, startIndex - scrollRange));
    setIsFollowModeEnabled(false);
  };

  const handleScrollNext = () => {
    const startIndex = listRef.current.Grid._renderedRowStartIndex;
    const stopIndex = listRef.current.Grid._renderedRowStopIndex;
    const scrollRange = stopIndex - startIndex;
    scrollToRow(stopIndex + scrollRange);
    scrollToRow(stopIndex - 1);
  };

  const handleScrollFirst = () => {
@@ -605,14 +595,8 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
    setIsFollowModeEnabled(false);
  };

  const scrollToEnd = () => {
    scrollToRow(-1);
    setTimeout(() => scrollToRow(-1), 100);
  };

  const handleScrollLast = () => {
    scrollToEnd();
    setIsFollowModeEnabled(true);
    scrollToRow(totalNonCollapsedRows + wsEvents.length);
  };

  const handleResize = ({ width }) => {
@@ -635,9 +619,6 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
    }
    scrollTop.current = e.scrollTop;
    scrollHeight.current = e.scrollHeight;
    if (e.scrollTop + e.clientHeight >= e.scrollHeight) {
      setIsFollowModeEnabled(true);
    }
  };

  const handleExpandCollapseAll = () => {
@@ -677,7 +658,8 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
        job={job}
        eventRelatedSearchableKeys={eventRelatedSearchableKeys}
        eventSearchableKeys={eventSearchableKeys}
        scrollToEnd={scrollToEnd}
        remoteRowCount={remoteRowCount}
        scrollToRow={scrollToRow}
        isFollowModeEnabled={isFollowModeEnabled}
        setIsFollowModeEnabled={setIsFollowModeEnabled}
      />
@@ -736,6 +718,7 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
        rowCount={totalNonCollapsedRows + wsEvents.length}
        rowHeight={cache.rowHeight}
        rowRenderer={rowRenderer}
        scrollToAlignment="start"
        width={width || 1}
        overscanRowCount={20}
        onScroll={handleScroll}

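The page-up arithmetic in the handleScrollPrevious hunk above is a small pure computation: the rendered window spans startIndex..stopIndex inclusive, and the target row is one full window back, clamped at zero. A hedged Python sketch of the inclusive-window (`+ 1`) variant shown in the hunk; the function name is illustrative, not part of the component:

```python
def previous_page_row(start_index, stop_index):
    """Row to jump to on a page-up: the visible window covers
    start_index..stop_index inclusive, so its size is
    stop_index - start_index + 1 rows; clamp at the first row."""
    scroll_range = stop_index - start_index + 1
    return max(0, start_index - scroll_range)
```

For a window rendering rows 30..39, `previous_page_row(30, 39)` returns 20; near the top of the list the `max(0, ...)` clamp keeps the target at row 0.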
@@ -30,7 +30,8 @@ function JobOutputSearch({
  job,
  eventRelatedSearchableKeys,
  eventSearchableKeys,
  scrollToEnd,
  remoteRowCount,
  scrollToRow,
  isFollowModeEnabled,
  setIsFollowModeEnabled,
}) {
@@ -82,7 +83,7 @@ function JobOutputSearch({
      setIsFollowModeEnabled(false);
    } else {
      setIsFollowModeEnabled(true);
      scrollToEnd();
      scrollToRow(remoteRowCount - 1);
    }
  };

@@ -30,7 +30,7 @@ function OrganizationDetail({ organization }) {
    created,
    modified,
    summary_fields,
    galaxy_credentials = [],
    galaxy_credentials,
  } = organization;
  const [contentError, setContentError] = useState(null);
  const [hasContentLoading, setHasContentLoading] = useState(true);
@@ -121,7 +121,7 @@ function OrganizationDetail({ organization }) {
          date={modified}
          user={summary_fields.modified_by}
        />
        {instanceGroups && (
        {instanceGroups && instanceGroups.length > 0 && (
          <Detail
            fullWidth
            label={t`Instance Groups`}
@@ -145,35 +145,35 @@ function OrganizationDetail({ organization }) {
              ))}
            </ChipGroup>
          }
          isEmpty={instanceGroups.length === 0}
        />
      )}
        <Detail
          fullWidth
          label={t`Galaxy Credentials`}
          value={
            <ChipGroup
              numChips={5}
              totalChips={galaxy_credentials?.length}
              ouiaId="galaxy-credential-chips"
            >
              {galaxy_credentials?.map((credential) => (
                <Link
                  key={credential.id}
                  to={`/credentials/${credential.id}/details`}
                >
                  <CredentialChip
                    credential={credential}
        {galaxy_credentials && galaxy_credentials.length > 0 && (
          <Detail
            fullWidth
            label={t`Galaxy Credentials`}
            value={
              <ChipGroup
                numChips={5}
                totalChips={galaxy_credentials.length}
                ouiaId="galaxy-credential-chips"
              >
                {galaxy_credentials.map((credential) => (
                  <Link
                    key={credential.id}
                    isReadOnly
                    ouiaId={`galaxy-credential-${credential.id}-chip`}
                  />
                </Link>
              ))}
            </ChipGroup>
          }
          isEmpty={galaxy_credentials?.length === 0}
        />
                    to={`/credentials/${credential.id}/details`}
                  >
                    <CredentialChip
                      credential={credential}
                      key={credential.id}
                      isReadOnly
                      ouiaId={`galaxy-credential-${credential.id}-chip`}
                    />
                  </Link>
                ))}
              </ChipGroup>
            }
          />
        )}
      </DetailList>
      <CardActionsRow>
        {summary_fields.user_capabilities.edit && (

@@ -216,44 +216,4 @@ describe('<OrganizationDetail />', () => {
      (el) => el.length === 0
    );
  });

  test('should not load instance groups', async () => {
    OrganizationsAPI.readInstanceGroups.mockResolvedValue({
      data: {
        results: [],
      },
    });

    let wrapper;
    await act(async () => {
      wrapper = mountWithContexts(
        <OrganizationDetail organization={mockOrganization} />
      );
    });
    wrapper.update();
    const instance_groups_detail = wrapper
      .find(`Detail[label="Instance Groups"]`)
      .at(0);
    expect(instance_groups_detail.prop('isEmpty')).toEqual(true);
  });

  test('should not load galaxy credentials', async () => {
    OrganizationsAPI.readInstanceGroups.mockResolvedValue({ data: {} });
    let wrapper;
    await act(async () => {
      wrapper = mountWithContexts(
        <OrganizationDetail
          organization={{
            ...mockOrganization,
            credential: [],
          }}
        />
      );
    });
    wrapper.update();
    const galaxy_credentials_detail = wrapper
      .find(`Detail[label="Galaxy Credentials"]`)
      .at(0);
    expect(galaxy_credentials_detail.prop('isEmpty')).toEqual(true);
  });
});

@@ -354,7 +354,7 @@ function JobTemplateDetail({ template }) {
            helpText={helpText.enabledOptions}
          />
        )}
        {summary_fields.credentials && (
        {summary_fields.credentials && summary_fields.credentials.length > 0 && (
          <Detail
            fullWidth
            label={t`Credentials`}
@@ -378,10 +378,9 @@ function JobTemplateDetail({ template }) {
              ))}
            </ChipGroup>
          }
          isEmpty={summary_fields.credentials.length === 0}
        />
      )}
        {summary_fields.labels && (
        {summary_fields.labels && summary_fields.labels.results.length > 0 && (
          <Detail
            fullWidth
            label={t`Labels`}
@@ -400,36 +399,36 @@ function JobTemplateDetail({ template }) {
              ))}
            </ChipGroup>
          }
          isEmpty={summary_fields.labels.results.length === 0}
        />
      )}
        <Detail
          fullWidth
          label={t`Instance Groups`}
          dataCy="jt-detail-instance-groups"
          helpText={helpText.instanceGroups}
          value={
            <ChipGroup
              numChips={5}
              totalChips={instanceGroups.length}
              ouiaId="instance-group-chips"
            >
              {instanceGroups.map((ig) => (
                <Link to={`${buildLinkURL(ig)}${ig.id}/details`} key={ig.id}>
                  <Chip
                    key={ig.id}
                    ouiaId={`instance-group-${ig.id}-chip`}
                    isReadOnly
                  >
                    {ig.name}
                  </Chip>
                </Link>
              ))}
            </ChipGroup>
          }
          isEmpty={instanceGroups.length === 0}
        />
        {job_tags && (
        {instanceGroups.length > 0 && (
          <Detail
            fullWidth
            label={t`Instance Groups`}
            dataCy="jt-detail-instance-groups"
            helpText={helpText.instanceGroups}
            value={
              <ChipGroup
                numChips={5}
                totalChips={instanceGroups.length}
                ouiaId="instance-group-chips"
              >
                {instanceGroups.map((ig) => (
                  <Link to={`${buildLinkURL(ig)}${ig.id}/details`} key={ig.id}>
                    <Chip
                      key={ig.id}
                      ouiaId={`instance-group-${ig.id}-chip`}
                      isReadOnly
                    >
                      {ig.name}
                    </Chip>
                  </Link>
                ))}
              </ChipGroup>
            }
          />
        )}
        {job_tags && job_tags.length > 0 && (
          <Detail
            fullWidth
            label={t`Job Tags`}
@@ -452,10 +451,9 @@ function JobTemplateDetail({ template }) {
              ))}
            </ChipGroup>
          }
          isEmpty={job_tags.length === 0}
        />
      )}
        {skip_tags && (
        {skip_tags && skip_tags.length > 0 && (
          <Detail
            fullWidth
            label={t`Skip Tags`}
@@ -478,7 +476,6 @@ function JobTemplateDetail({ template }) {
              ))}
            </ChipGroup>
          }
          isEmpty={skip_tags.length === 0}
        />
      )}
        <VariablesDetail

@@ -195,94 +195,4 @@ describe('<JobTemplateDetail />', () => {
      wrapper.find(`Detail[label="Execution Environment"] dd`).text()
    ).toBe('Default EE');
  });

  test('should not load credentials', async () => {
    await act(async () => {
      wrapper = mountWithContexts(
        <JobTemplateDetail
          template={{
            ...mockTemplate,
            allow_simultaneous: true,
            ask_inventory_on_launch: true,
            summary_fields: {
              credentials: [],
            },
          }}
        />
      );
    });
    const credentials_detail = wrapper
      .find(`Detail[label="Credentials"]`)
      .at(0);
    expect(credentials_detail.prop('isEmpty')).toEqual(true);
  });

  test('should not load labels', async () => {
    await act(async () => {
      wrapper = mountWithContexts(
        <JobTemplateDetail
          template={{
            ...mockTemplate,
            allow_simultaneous: true,
            ask_inventory_on_launch: true,
            summary_fields: {
              labels: {
                results: [],
              },
            },
          }}
        />
      );
    });
    const labels_detail = wrapper.find(`Detail[label="Labels"]`).at(0);
    expect(labels_detail.prop('isEmpty')).toEqual(true);
  });

  test('should not load instance groups', async () => {
    JobTemplatesAPI.readInstanceGroups.mockResolvedValue({
      data: {
        results: [],
      },
    });

    let wrapper;
    await act(async () => {
      wrapper = mountWithContexts(
        <JobTemplateDetail template={mockTemplate} />
      );
    });
    wrapper.update();
    const instance_groups_detail = wrapper
      .find(`Detail[label="Instance Groups"]`)
      .at(0);
    expect(instance_groups_detail.prop('isEmpty')).toEqual(true);
  });

  test('should not load job tags', async () => {
    await act(async () => {
      wrapper = mountWithContexts(
        <JobTemplateDetail
          template={{
            ...mockTemplate,
            job_tags: '',
          }}
        />
      );
    });
    expect(wrapper.find('Detail[label="Job Tags"]').length).toBe(0);
  });

  test('should not load skip tags', async () => {
    await act(async () => {
      wrapper = mountWithContexts(
        <JobTemplateDetail
          template={{
            ...mockTemplate,
            skip_tags: '',
          }}
        />
      );
    });
    expect(wrapper.find('Detail[label="Skip Tags"]').length).toBe(0);
  });
});

@@ -110,11 +110,12 @@ function WorkflowJobTemplateDetail({ template }) {
      <DetailList gutter="sm">
        <Detail label={t`Name`} value={name} dataCy="jt-detail-name" />
        <Detail label={t`Description`} value={description} />
        <Detail
          value={<Sparkline jobs={recentPlaybookJobs} />}
          label={t`Activity`}
          isEmpty={summary_fields.recent_jobs?.length === 0}
        />
        {summary_fields.recent_jobs?.length > 0 && (
          <Detail
            value={<Sparkline jobs={recentPlaybookJobs} />}
            label={t`Activity`}
          />
        )}
        {summary_fields.organization && (
          <Detail
            label={t`Organization`}
@@ -201,25 +202,26 @@ function WorkflowJobTemplateDetail({ template }) {
            helpText={helpText.enabledOptions}
          />
        )}
        <Detail
          fullWidth
          label={t`Labels`}
          helpText={helpText.labels}
          value={
            <ChipGroup
              numChips={3}
              totalChips={summary_fields.labels.results.length}
              ouiaId="workflow-job-template-detail-label-chips"
            >
              {summary_fields.labels.results.map((l) => (
                <Chip key={l.id} ouiaId={`${l.name}-label-chip`} isReadOnly>
                  {l.name}
                </Chip>
              ))}
            </ChipGroup>
          }
          isEmpty={!summary_fields.labels?.results?.length}
        />
        {summary_fields.labels?.results?.length > 0 && (
          <Detail
            fullWidth
            label={t`Labels`}
            helpText={helpText.labels}
            value={
              <ChipGroup
                numChips={3}
                totalChips={summary_fields.labels.results.length}
                ouiaId="workflow-job-template-detail-label-chips"
              >
                {summary_fields.labels.results.map((l) => (
                  <Chip key={l.id} ouiaId={`${l.name}-label-chip`} isReadOnly>
                    {l.name}
                  </Chip>
                ))}
              </ChipGroup>
            }
          />
        )}
        <VariablesDetail
          dataCy="workflow-job-template-detail-extra-vars"
          helpText={helpText.variables}

@@ -178,46 +178,4 @@ describe('<WorkflowJobTemplateDetail/>', () => {
    expect(inventory.prop('to')).toEqual('/inventories/inventory/1/details');
    expect(organization.prop('to')).toEqual('/organizations/1/details');
  });

  test('should not load Activity', async () => {
    await act(async () => {
      wrapper = mountWithContexts(
        <WorkflowJobTemplateDetail
          template={{
            ...template,
            summary_fields: {
              ...template.summary_fields,
              recent_jobs: [],
            },
          }}
          hasContentLoading={false}
          onSetContentLoading={() => {}}
        />
      );
    });
    const activity_detail = wrapper.find(`Detail[label="Activity"]`).at(0);
    expect(activity_detail.prop('isEmpty')).toEqual(true);
  });

  test('should not load Labels', async () => {
    await act(async () => {
      wrapper = mountWithContexts(
        <WorkflowJobTemplateDetail
          template={{
            ...template,
            summary_fields: {
              ...template.summary_fields,
              labels: {
                results: [],
              },
            },
          }}
          hasContentLoading={false}
          onSetContentLoading={() => {}}
        />
      );
    });
    const labels_detail = wrapper.find(`Detail[label="Labels"]`).at(0);
    expect(labels_detail.prop('isEmpty')).toEqual(true);
  });
});

@@ -44,28 +44,30 @@ function TemplatePopoverContent({ template }) {
        value={template?.playbook}
        dataCy={`template-${template.id}-playbook`}
      />
      <Detail
        fullWidth
        label={t`Credentials`}
        dataCy={`template-${template.id}-credentials`}
        value={
          <ChipGroup
            numChips={5}
            totalChips={template.summary_fields?.credentials?.length}
            ouiaId={`template-${template.id}-credential-chips`}
          >
            {template.summary_fields?.credentials?.map((c) => (
              <CredentialChip
                key={c.id}
                credential={c}
                isReadOnly
                ouiaId={`credential-${c.id}-chip`}
              />
            ))}
          </ChipGroup>
        }
        isEmpty={template.summary_fields?.credentials?.length === 0}
      />
      {template.summary_fields?.credentials &&
      template.summary_fields.credentials.length ? (
        <Detail
          fullWidth
          label={t`Credentials`}
          dataCy={`template-${template.id}-credentials`}
          value={
            <ChipGroup
              numChips={5}
              totalChips={template.summary_fields.credentials.length}
              ouiaId={`template-${template.id}-credential-chips`}
            >
              {template.summary_fields.credentials.map((c) => (
                <CredentialChip
                  key={c.id}
                  credential={c}
                  isReadOnly
                  ouiaId={`credential-${c.id}-chip`}
                />
              ))}
            </ChipGroup>
          }
        />
      ) : null}
    </DetailList>
  );
}

@@ -309,24 +309,25 @@ function WorkflowApprovalDetail({ workflowApproval }) {
          dataCy="wa-detail-inventory"
        />
      ) : null}
      <Detail
        fullWidth
        label={t`Labels`}
        value={
          <ChipGroup
            numChips={5}
            totalChips={workflowJob.summary_fields.labels.results.length}
            ouiaId="wa-detail-label-chips"
          >
            {workflowJob.summary_fields.labels.results.map((label) => (
              <Chip key={label.id} isReadOnly>
                {label.name}
              </Chip>
            ))}
          </ChipGroup>
        }
        isEmpty={!workflowJob?.summary_fields?.labels?.results?.length}
      />
      {workflowJob?.summary_fields?.labels?.results?.length > 0 && (
        <Detail
          fullWidth
          label={t`Labels`}
          value={
            <ChipGroup
              numChips={5}
              totalChips={workflowJob.summary_fields.labels.results.length}
              ouiaId="wa-detail-label-chips"
            >
              {workflowJob.summary_fields.labels.results.map((label) => (
                <Chip key={label.id} isReadOnly>
                  {label.name}
                </Chip>
              ))}
            </ChipGroup>
          }
        />
      )}
      {workflowJob?.extra_vars ? (
        <VariablesDetail
          dataCy="wa-detail-variables"

@@ -482,33 +482,6 @@ describe('<WorkflowApprovalDetail />', () => {
    expect(wrapper.find('DeleteButton').length).toBe(1);
  });

  test('should not load Labels', async () => {
    WorkflowJobTemplatesAPI.readDetail.mockResolvedValue({
      data: workflowJobTemplate,
    });
    WorkflowJobsAPI.readDetail.mockResolvedValue({
      data: {
        ...workflowApproval,
        summary_fields: {
          ...workflowApproval.summary_fields,
          labels: {
            results: [],
          },
        },
      },
    });

    let wrapper;
    await act(async () => {
      wrapper = mountWithContexts(
        <WorkflowApprovalDetail workflowApproval={workflowApproval} />
      );
    });
    waitForElement(wrapper, 'WorkflowApprovalDetail', (el) => el.length > 0);
    const labels_detail = wrapper.find(`Detail[label="Labels"]`).at(0);
    expect(labels_detail.prop('isEmpty')).toEqual(true);
  });

  test('Error dialog shown for failed approval', async () => {
    WorkflowApprovalsAPI.approve.mockImplementationOnce(() =>
      Promise.reject(new Error())

@@ -1,9 +1,9 @@
/* eslint-disable no-undef */
importScripts('d3-collection.v1.min.js');
importScripts('d3-dispatch.v1.min.js');
importScripts('d3-quadtree.v1.min.js');
importScripts('d3-timer.v1.min.js');
importScripts('d3-force.v1.min.js');
importScripts('https://d3js.org/d3-collection.v1.min.js');
importScripts('https://d3js.org/d3-dispatch.v1.min.js');
importScripts('https://d3js.org/d3-quadtree.v1.min.js');
importScripts('https://d3js.org/d3-timer.v1.min.js');
importScripts('https://d3js.org/d3-force.v1.min.js');

onmessage = function calculateLayout({ data: { nodes, links } }) {
  const simulation = d3

@@ -33,7 +33,6 @@ action_groups:
    - role
    - schedule
    - settings
    - subscriptions
    - team
    - token
    - user

@@ -23,7 +23,7 @@ options:
    manifest:
      description:
        - file path to a Red Hat subscription manifest (a .zip file)
      required: False
      required: True
      type: str
    force:
      description:
@@ -31,11 +31,6 @@ options:
          unlicensed or trial licensed. When force=true, the license is always applied.
      type: bool
      default: 'False'
    pool_id:
      description:
        - Red Hat or Red Hat Satellite pool_id to attach to
      required: False
      type: str
    state:
      description:
        - Desired state of the resource.
@@ -52,10 +47,6 @@ EXAMPLES = '''
  license:
    manifest: "/tmp/my_manifest.zip"

- name: Attach to a pool
  license:
    pool_id: 123456

- name: Remove license
  license:
    state: absent
@@ -70,14 +61,12 @@ def main():
    module = ControllerAPIModule(
        argument_spec=dict(
            manifest=dict(type='str', required=False),
            pool_id=dict(type='str', required=False),
            force=dict(type='bool', default=False),
            state=dict(choices=['present', 'absent'], default='present'),
        ),
        required_if=[
            ['state', 'present', ['manifest', 'pool_id'], True],
            ['state', 'present', ['manifest']],
        ],
        mutually_exclusive=[("manifest", "pool_id")],
    )

    json_output = {'changed': False}
@@ -88,12 +77,11 @@ def main():
        module.delete_endpoint('config')
        module.exit_json(**json_output)

    if module.params.get('manifest', None):
        try:
            with open(module.params.get('manifest'), 'rb') as fid:
                manifest = base64.b64encode(fid.read())
        except OSError as e:
            module.fail_json(msg=str(e))
    try:
        with open(module.params.get('manifest'), 'rb') as fid:
            manifest = base64.b64encode(fid.read())
    except OSError as e:
        module.fail_json(msg=str(e))

    # Check if Tower is already licensed
    config = module.get_endpoint('config')['json']
@@ -116,10 +104,7 @@ def main():
    # Do the actual install, if we need to
    if perform_install:
        json_output['changed'] = True
        if module.params.get('manifest', None):
            module.post_endpoint('config', data={'manifest': manifest.decode()})
        else:
            module.post_endpoint('config/attach', data={'pool_id': module.params.get('pool_id')})
        module.post_endpoint('config', data={'manifest': manifest.decode()})

    module.exit_json(**json_output)

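The manifest branch in the license module above reduces to reading the .zip file's bytes and base64-encoding them for the `manifest` field of the POST body. A minimal standalone sketch of that same open/b64encode pattern; the function name is illustrative, not part of the module:

```python
import base64


def encode_manifest(path):
    # Mirror the module's pattern: read the manifest .zip as raw bytes
    # and return them base64-encoded, ready for the 'manifest' POST field.
    with open(path, 'rb') as fid:
        return base64.b64encode(fid.read())
```

The module later calls `.decode()` on this value because `b64encode` returns `bytes`, while the JSON POST body needs a `str`.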
@@ -1,101 +0,0 @@
#!/usr/bin/python
# coding: utf-8 -*-

# (c) 2019, John Westcott IV <john.westcott.iv@redhat.com>
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)

from __future__ import absolute_import, division, print_function

__metaclass__ = type

ANSIBLE_METADATA = {'metadata_version': '1.1', 'status': ['preview'], 'supported_by': 'community'}


DOCUMENTATION = '''
---
module: subscriptions
author: "John Westcott IV (@john-westcott-iv)"
short_description: Get subscription list
description:
    - Get subscriptions available to Automation Platform Controller. See
      U(https://www.ansible.com/tower) for an overview.
options:
    username:
      description:
        - Red Hat or Red Hat Satellite username to get available subscriptions.
        - The credentials you use will be stored for future use in retrieving renewal or expanded subscriptions
      required: True
      type: str
    password:
      description:
        - Red Hat or Red Hat Satellite password to get available subscriptions.
        - The credentials you use will be stored for future use in retrieving renewal or expanded subscriptions
      required: True
      type: str
    filters:
      description:
        - Client side filters to apply to the subscriptions.
        - For any entries in this dict, if there is a corresponding entry in the subscription it must contain the value from this dict
        - Note This is a client side search, not an API side search
      required: False
      type: dict
extends_documentation_fragment: awx.awx.auth
'''

RETURN = '''
subscriptions:
    description: dictionary containing information about the subscriptions
    returned: If login succeeded
    type: dict
'''

EXAMPLES = '''
- name: Get subscriptions
  subscriptions:
    username: "my_username"
    password: "My Password"

- name: Get subscriptions with a filter
  subscriptions:
    username: "my_username"
    password: "My Password"
    filters:
      product_name: "Red Hat Ansible Automation Platform"
      support_level: "Self-Support"
'''

from ..module_utils.controller_api import ControllerAPIModule


def main():

    module = ControllerAPIModule(
        argument_spec=dict(
            username=dict(type='str', required=True),
            password=dict(type='str', no_log=True, required=True),
            filters=dict(type='dict', required=False, default={}),
        ),
    )

    json_output = {'changed': False}

    # Check if Tower is already licensed
    post_data = {
        'subscriptions_password': module.params.get('password'),
        'subscriptions_username': module.params.get('username'),
    }
    all_subscriptions = module.post_endpoint('config/subscriptions', data=post_data)['json']
    json_output['subscriptions'] = []
    for subscription in all_subscriptions:
        add = True
        for key in module.params.get('filters').keys():
            if subscription.get(key, None) and module.params.get('filters')[key] not in subscription.get(key):
                add = False
        if add:
            json_output['subscriptions'].append(subscription)

    module.exit_json(**json_output)


if __name__ == '__main__':
    main()
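The filtering loop in the removed subscriptions module is self-contained and easy to isolate: a subscription is dropped only when it actually has the filter key and the wanted value is not contained in the subscription's value; entries missing the key pass through. A sketch of that same client-side matching as a standalone function (name is illustrative):

```python
def filter_subscriptions(subscriptions, filters):
    """Same logic as the removed module's loop: per-key substring
    containment, applied client-side after the API call."""
    kept = []
    for subscription in subscriptions:
        add = True
        for key, wanted in filters.items():
            # Only reject when the key is present (and truthy) but does
            # not contain the wanted value, matching the original code.
            if subscription.get(key) and wanted not in subscription.get(key):
                add = False
        if add:
            kept.append(subscription)
    return kept
```

So `filters: {product_name: "Ansible"}` keeps any subscription whose `product_name` contains "Ansible", and also keeps subscriptions with no `product_name` at all.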
@@ -41,7 +41,6 @@ no_endpoint_for_module = [
    'workflow_template',
    'ad_hoc_command_wait',
    'ad_hoc_command_cancel',
    'subscriptions',  # Subscription deals with config/subscriptions
]

# Global module parameters we can ignore

@@ -283,11 +283,9 @@ class ApiV2(base.Base):

                    _page = _page.put(post_data)
                    changed = True
                except exc.NoContent:  # desired exception under some circumstances, e.g. labels that already exist
                    pass
                except (exc.Common, AssertionError) as e:
                    identifier = asset.get("name", None) or asset.get("username", None) or asset.get("hostname", None)
                    log.error(f'{endpoint} "{identifier}": {e}.')
                    log.error(f"{endpoint} \"{identifier}\": {e}.")
                    self._has_error = True
                    log.debug("post_data: %r", post_data)
                    continue

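The quoting change in the ApiV2 hunk above is behavior-preserving: a single-quoted f-string with embedded double quotes renders exactly the same text as a double-quoted f-string with escaped quotes. A quick check with stand-in values (the variable names mirror the hunk, the values are illustrative):

```python
endpoint, identifier, e = 'credentials', 'admin', 'bad request'

single = f'{endpoint} "{identifier}": {e}.'
double = f"{endpoint} \"{identifier}\": {e}."

# Both spellings produce the identical log line; only the source quoting differs.
assert single == double
```

Only the source-level quoting style differs between the two lines in the diff, so the emitted log message is unchanged.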
@@ -35,7 +35,16 @@ For the following playbooks to work, you will need to:
$ pip install openshift
```

If you are not changing any code in the operator itself, git checkout the latest version from https://github.com/ansible/awx-operator/releases, and then follow the instructions in the awx-operator [README](https://github.com/ansible/awx-operator#basic-install).
If you are not changing any code in the operator itself, git checkout the latest version from https://github.com/ansible/awx-operator/releases, and then run the following command (from the awx-operator repo):

```
$ alias kubectl="minikube kubectl --"
$ export NAMESPACE=my-namespace
$ kubectl create namespace $NAMESPACE
$ kubectl config set-context --current --namespace=$NAMESPACE
$ make deploy

```

If making changes to the operator itself, run the following command in the root
of the awx-operator repo. If not, continue to the next section.
@@ -48,6 +57,7 @@ $ export IMAGE_TAG_BASE=quay.io/<username>/awx-operator
$ export VERSION=<custom-tag>
$ make docker-build
$ docker push ${IMAGE_TAG_BASE}:${VERSION}
$ make deploy
```

## Deploy AWX into Minikube using the AWX Operator

@@ -45,7 +45,7 @@ python-dsv-sdk
python-tss-sdk==1.0.0
python-ldap>=3.4.0  # https://github.com/ansible/awx/security/dependabot/20
pyyaml>=5.4.1  # minimum to fix https://github.com/yaml/pyyaml/issues/478
receptorctl==1.2.3
receptorctl==1.1.1
schedule==0.6.0
social-auth-core==4.2.0  # see UPGRADE BLOCKERs
social-auth-app-django==5.0.0  # see UPGRADE BLOCKERs

@@ -309,7 +309,7 @@ pyyaml==5.4.1
    # djangorestframework-yaml
    # kubernetes
    # receptorctl
receptorctl==1.2.3
receptorctl==1.1.1
    # via -r /awx_devel/requirements/requirements.in
redis==3.4.1
    # via

@@ -5,16 +5,15 @@
  tasks:
    - name: Get version from SCM if not explicitly provided
      shell: |
        make print-VERSION | cut -d + -f -1
        python3 -m setuptools_scm | cut -d + -f -1
      args:
        chdir: '../../'
      register: scm_version
      failed_when: not scm_version.stdout
      register: setup_py_version
      when: awx_version is not defined

    - name: Set awx_version
      set_fact:
        awx_version: "{{ scm_version.stdout }}"
        awx_version: "{{ setup_py_version.stdout }}"
      when: awx_version is not defined

    - include_role:

@@ -43,7 +43,7 @@ RUN dnf -y update && dnf install -y 'dnf-command(config-manager)' && \
    xmlsec1-devel \
    xmlsec1-openssl-devel

RUN pip3 install virtualenv build
RUN pip3 install virtualenv setuptools_scm build


# Install & build requirements
@@ -152,7 +152,7 @@ RUN dnf -y install \
    unzip && \
    npm install -g n && n 16.13.1 && npm install -g npm@8.5.0 && dnf remove -y nodejs

RUN pip3 install black git+https://github.com/coderanger/supervisor-stdout setuptools-scm
RUN pip3 install black git+https://github.com/coderanger/supervisor-stdout

# This package randomly fails to download.
# It is nice to have in the dev env, but not necessary.
@@ -246,7 +246,6 @@ RUN for dir in \
    /etc/containers \
    /var/lib/awx/.config/containers \
    /var/lib/awx/.config/cni \
    /var/lib/awx/.local \
    /var/lib/awx/venv \
    /var/lib/awx/venv/awx/bin \
    /var/lib/awx/venv/awx/lib/python3.9 \

@@ -1,48 +0,0 @@
#!/bin/bash

# Rename the zh_cn folder
mv translations/zh_cn translations/zh

# Create a directory for api (locale)
mkdir locale

# Copy all subdirectories to locale
cp -r translations/ locale/

# Loop over each directory and create another directory LC_Messages
# Move django.po files to LC_Messages and remove messages.po
cd locale/
for d in */ ; do
    dir=${d%*/}
    mkdir $dir/LC_MESSAGES
    mv $dir/django.po $dir/LC_MESSAGES/
    rm $dir/messages.po
done

cd ..

# Create a directory for ui (locales)
mkdir locales

# Copy all subdirectories to locales
cp -r translations/ locales/

# Loop over each directory and remove django.po
cd locales
for d in */ ; do
    dir=${d%*/}
    rm $dir/django.po
done

cd ..

awx_api_path="awx/locale"  # locale will be dropped here
awx_ui_path="awx/ui/src/locales"  # locales will be dropped here

rsync -av locale/ $awx_api_path
rsync -av locales/ $awx_ui_path

rm -rf translations/
rm -rf locale/
rm -rf locales/

@@ -1,10 +0,0 @@
#!/bin/bash

# Extract Strings from API & UI
make docker-compose-sources
docker-compose -f tools/docker-compose/_sources/docker-compose.yml run awx_1 make awx-link migrate po messages

# Move extracted Strings to Translation Directory
mv awx/locale/en-us/LC_MESSAGES/django.po translations/
mv awx/ui/src/locales/en/messages.po translations/

Some files were not shown because too many files have changed in this diff.