pulp / pulp-2-tests

⛔️ Pulp 2 is EOL as of November 30, 2022; for more info see https://pulpproject.org/2022/09/19/pulp-2-eol/. ⛔️ Functional tests for Pulp 2.

Home Page: https://pulp-2-tests.readthedocs.io/

Makefile 0.27% Python 99.44% Shell 0.30%
pulp pulp-project testing quality-assurance quality-engineering qe python-test redhat redhat-qe


pulp-2-tests's People

Contributors

bherrin3, dkliban, dralley, ichimonji10, ipanova, omaciel, ragabala, rochacbruno


pulp-2-tests's Issues

Discussion: Changes to be made for `py.test --random-order` implementation in CI

Synopsis of Change

Part of the CI improvement in the pulp-ci review doc is finding a way to run tests in random order.

pytest-random-order

Positive

  • The pytest plug-in pytest-random-order [0] provides this ability.
  • The environment requires the plugin to be installed.

Negative

  • Each module or class that needs a fixed internal order requires pytestmark = pytest.mark.random_order(disabled=True), wherever this is implemented (Pulp-2-Tests, or per test for Pulp3)
  • Change to pulp-ci for the Ansible job pulp-smash-runner.yaml [2] to execute py.test -v --color=yes --random-order --junit-xml=junit-report.xml --pyargs pulp_2_tests.tests
  • [Pulp3] Change each Travis job for each plug-in to add --random-order

pytest-randomly

  • Has no per-module or per-class controls and appears to be a subset of pytest-random-order

pytest-ordering

  • Looks the same as pytest-random-order, but uses the pytest.mark notation more extensively
  • Even more hand-managed per test (not what we are looking for)

Background

We would like each class's tests to run in the fixed order they were written, while everything outside the class runs in random order; a simple command-line argument alone cannot provide this.

From Ref [0], the section Disable Shuffling in Module or Class documents the changes we would need to make.

You can disable shuffling of tests within a single module or class by marking the module or class with random_order marker and passing disabled=True to it:

pytestmark = pytest.mark.random_order(disabled=True)

def test_number_one():
    assert True

def test_number_two():
    assert True

class MyTest(TestCase):
    pytestmark = pytest.mark.random_order(disabled=True)

    def test_number_one(self):
        self.assertTrue(True)

No matter what will be the bucket type for the test run, test_number_one will always run before test_number_two.

There are also examples of parametrizing tests that may be useful. I did not put enough time into looking at that, but I am providing a reference [3].

Testing

To ensure this is indeed what we wanted, I tested this on a mock-up [1] [4] which can be reviewed or expanded upon.

Also, I created a test branch in Pulp-2-Tests with a single folder implemented in this manner, with some results and the ability to pull and test [5].

Summary

Before making wide and sweeping changes, note that this would change how we write tests.

The per-file change is very specific and not hard to make, but there are a lot of files to update.

References

[0] - https://pypi.org/project/pytest-random-order/
[1] - https://paste.fedoraproject.org/paste/zcn-awO7XkdsfXXC0Oq84w
[2] - https://github.com/pulp/pulp-ci/blob/f19ba59942542da1940b217b9fa2f0232b285190/ci/jjb/jobs/pulp-smash-runner.yaml#L123
[3] - https://docs.pytest.org/en/latest/example/parametrize.html#a-quick-port-of-testscenarios
[4] - https://paste.fedoraproject.org/paste/e4qq0e4zzC-71BhLrWhFJw
[5] - bherrin3#1

As a user, I can manage modular Errata content

Description

The erratum updateinfo file format was recently updated to support referring to modules.
A good example of an updateinfo file can be seen in the modularity fixture repository. The TL;DR of the format update is that a collection element can now carry a module child element, in which case the collection's packages belong to the referred module:

<updates>
  <update status="stable" from="[email protected]" version="1" type="enhancement">
    <id>RHEA-2012:0059</id>
    <issued date="2018-01-27 16:08:09" />
    <title>Duck_Kangaroo_Erratum</title>
    <release>1</release>
    <description>Duck_Kangaro_Erratum description</description>
    <updated date="2018-07-20 06:00:01 UTC" />
    <references />
    <pkglist>
      <collection short="modulerr_kangaroo_1">
        <name>modulerr_kangaroo_1</name>
        <module stream="0" version="20180730223407" arch="noarch" name="kangaroo" context="deadbeef" />
        <package src="http://www.fedoraproject.org" name="kangaroo" epoch="0" version="0.3" release="1" arch="noarch">
          <filename>kangaroo-0.3-1.noarch.rpm</filename>
        </package>
      </collection>
    <!-- ------\>%------- -->
    </pkglist>
  </update>
</updates>

Steps

  • create a repo with the feed pointing to the modularity fixture repository
  • synchronize and publish the repository
  • the published repository updateinfo file should contain the same modular references as the fixture repo 🍐
  • check that all non-modular (ursine) rpms beneath an update element are aggregated in a single collection packages list 🍐
  • check that every module has a custom collection 🍐
  • check that each collection name is unique beneath the update element, i.e. the name has a _<module_name>_<arbitrary increasing number> suffix 🍐
  • create another, empty repo, without a feed
  • recursively copy, e.g., the RHEA-2012:0059 erratum into the other repo
  • check that any modules referred to by the erratum were copied, including their respective rpms 🍐
  • follow my modular erratum upload gist (or similar) to upload custom modular erratum into the other repo
  • publish the other repo
  • check that the custom erratum was published 🍐
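As a sanity check, the uniqueness and module-membership checks above can be sketched against a trimmed copy of the fixture snippet. This is a minimal stand-alone validator using only the standard library, not the project's actual assertion helpers:

```python
import xml.etree.ElementTree as ET

# Trimmed copy of the fixture snippet above; values are illustrative.
UPDATEINFO = """\
<updates>
  <update status="stable" version="1" type="enhancement">
    <id>RHEA-2012:0059</id>
    <pkglist>
      <collection short="modulerr_kangaroo_1">
        <name>modulerr_kangaroo_1</name>
        <module stream="0" version="20180730223407" arch="noarch"
                name="kangaroo" context="deadbeef" />
        <package name="kangaroo" epoch="0" version="0.3" release="1"
                 arch="noarch">
          <filename>kangaroo-0.3-1.noarch.rpm</filename>
        </package>
      </collection>
    </pkglist>
  </update>
</updates>
"""

def check_update(update):
    """Assert unique collection names; return the modular collections
    (those carrying a <module> child)."""
    names = [c.findtext('name') for c in update.iter('collection')]
    assert len(names) == len(set(names)), 'collection names must be unique'
    return [c for c in update.iter('collection') if c.find('module') is not None]

root = ET.fromstring(UPDATEINFO)
modular = [c for u in root.iter('update') for c in check_update(u)]
```

A real test would fetch the published updateinfo.xml(.gz) from the distributor and run the same checks per update element.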

Notes

🪲 https://pulp.plan.io/issues/3919

Test Pulp OSTree to create a new unit for each commit version on a branch Story #2594

Migrated from pulp/pulp-smash#639
author: @preethit - https://github.com/preethit
date created: 2017-05-10T18:53:33Z

https://pulp.plan.io/issues/2594
Link to the test plan
https://pulp.plan.io/issues/2594?pn=1#note-12

  1. Create an ostree repository and commit a tree using /etc/pulp as the source.

  2. Create and sync a repository.

  3. List content and note that a content unit was created for the branch HEAD.

  4. Make a few more commits.

  5. Sync again and list the content.

    • The list should contain the first branch HEAD commit
    • Intermediate commits (this is new)
    • The current branch HEAD commit.
  6. Ignore the order.

Test the sync/upload of RPM with large metadata

Migrated from pulp/pulp-smash#497
author: @goosemania - https://github.com/goosemania
date created: 2017-01-20T01:56:33Z

With Pulp 2.12.1+, metadata for RPM/SRPM units is stored in compressed form in the database. This change allows Pulp to handle packages with larger metadata than was previously possible.

For testing purposes, verify that sync or upload of such an RPM does not result in a DocumentTooLarge error.

As for the upgrade, the migration will run and the metadata for all existing units will be compressed. I do not think there is anything specific to test here; just make sure all the rpm sync/upload/publish-related tests still work after the upgrade.
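The motivation can be illustrated with a rough sketch: repetitive metadata far larger than MongoDB's 16 MB BSON document limit compresses to a small fraction of that size. The numbers are illustrative, and this is not Pulp's storage code:

```python
import zlib

# Simulate an unusually large metadata blob (~23 MB of repetitive XML),
# which would exceed MongoDB's 16 MB BSON document limit uncompressed.
metadata = '<changelog author="dev" date="1485907200">fix</changelog>\n' * 400000
raw_size = len(metadata.encode())
compressed = zlib.compress(metadata.encode())
mongo_limit = 16 * 1024 * 1024
```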

Redmine issue: https://pulp.plan.io/issues/723

As a user I would like to limit the tags we sync for docker repos

Migrated from pulp/pulp-smash#1064
author: @werwty - https://github.com/werwty
date created: 2018-06-12T18:02:59Z

https://pulp.plan.io/issues/3450

This feature adds a tag whitelisting option to pulp_docker, but only for v2 content.

Testing v2 content

  1. pulp-admin docker repo create --repo-id=v2whitelist --feed=https://registry-1.docker.io --upstream-name=busybox --tags=latest,1
  2. pulp-admin docker repo sync run --repo-id v2whitelist
  3. validate that the synced units only contain the latest and 1 tags
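Step 3 could be sketched as a check over the repo's unit search results; the unit dictionaries below are a simplified assumption of the pulp_docker unit shape, not the exact schema:

```python
def assert_only_whitelisted_tags(units, whitelist):
    """Fail if any synced tag unit falls outside the whitelist; return
    the set of tag names that were synced."""
    tags = {u['name'] for u in units if u['type_id'] == 'docker_tag'}
    stray = tags - set(whitelist)
    if stray:
        raise AssertionError('unexpected tags synced: %s' % sorted(stray))
    return tags

# Example unit list for a sync that honoured --tags=latest,1
units = [
    {'type_id': 'docker_tag', 'name': 'latest'},
    {'type_id': 'docker_tag', 'name': '1'},
    {'type_id': 'docker_manifest', 'name': 'sha256:abc'},
]
synced = assert_only_whitelisted_tags(units, ['latest', '1'])
```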

Refactor all uses of copy_unit to a new base class and MixIn

Problem

While refactoring test cases that use copy_units for throughput, code is duplicated because each implementation is a class method.

Solution

For all similar instances, refactor copy_units into an abstraction that any test class can call and use.
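One possible shape for the abstraction is a mixin with a single class-level helper; the method name, endpoint construction, and client interface are assumptions for illustration, not the existing pulp-2-tests API:

```python
from urllib.parse import urljoin

class CopyUnitsMixin:
    """Share one copy-units helper across test classes.

    ``client`` must expose ``post(path, body)``; the body follows Pulp 2's
    unit-association format (simplified here).
    """

    @staticmethod
    def copy_units(client, source_repo_id, dest_repo_href, criteria=None):
        body = {'source_repo_id': source_repo_id}
        if criteria is not None:
            body['criteria'] = criteria
        return client.post(urljoin(dest_repo_href, 'actions/associate/'), body)

class FakeClient:
    """Stand-in client that records the last request."""
    def post(self, path, body):
        self.last = (path, body)
        return {'spawned_tasks': []}

client = FakeClient()
CopyUnitsMixin.copy_units(client, 'repo-a', '/pulp/api/v2/repositories/repo-b/')
```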

Assert that a YumMetadataFile copy saves its new storage_path.

Migrated from pulp/pulp-smash#279
author: @ipanova - https://github.com/ipanova
date created: 2016-05-31T15:43:31Z

  1. create repo A
  2. sync from a feed or upload yum_metadata_file
  3. create repo B
  4. copy yum_metadata_file from repo A to B
  5. publish repo B
  6. remove yum_metadata_file from repo A
  7. remove orphans
  8. publish repo B again ---> publish should succeed

for more info check https://pulp.plan.io/issues/1944
https://pulp.plan.io/issues/1935#note-5
pulp/pulp_rpm#895

Replace parent class BaseAPITestCase by unittest.TestCase

Migrated from pulp/pulp-smash#763
author: @kersommoura - https://github.com/kersommoura
date created: 2017-08-22T18:57:32Z

A few tests fail due to time-outs that happen during tear-down.
To reduce the number of calls to client.delete(ORPHANS_PATH), and thereby the number of failures, replace BaseAPITestCase with unittest.TestCase.

    @classmethod
    def tearDownClass(cls):
        """Delete all resources named by ``resources``."""
        client = api.Client(cls.cfg)
        for resource in cls.resources:
            client.delete(resource)
        client.delete(ORPHANS_PATH)

Basically, it will be necessary to go class by class, replace the parent class, and do the necessary refactoring.

See also: #762

As a user I can manage modular content

Migrated from pulp/pulp-smash#1122
author: @ipanova - https://github.com/ipanova
date created: 2018-08-08T12:09:08Z

Sync/publish scenario

  1. create and sync modular repo
  2. validate that modulemd and modulemd_defaults units are identified in the repo.
  3. publish repo.
  4. validate that compressed modules.yaml doc is available in the repodata
  5. test dnf client is able to install module from the metadata produced by pulp

Sync/copy scenario ( modulemd)

  1. create and sync modular repo
  2. copy module A; ensure the destination repo has module A
  3. recursively copy module B; ensure the destination repo has module B plus its rpms (artifacts)

Sync/remove scenario ( modulemd)

  1. create and sync modular repo
  2. remove module A; ensure its rpms are removed from the repo as well. Only rpms not cross-referenced by other modules within the repo should be removed. This way we ensure we do not leave behind ursine modular rpms (ursine == an rpm not belonging to any module).

Sync/copy scenario ( modulemd_defaults)

  1. create and sync modular repo
  2. copy modulemd-defaults A; ensure the destination repo has modulemd-defaults A. Nothing else is carried over.

1 defaults per repo scenario ( modulemd_defaults)

  1. create and sync modular repo
  2. copy modulemd-defaults A; ensure destination repo1 has modulemd-defaults A. Nothing else is carried over.
  3. upload a modulemd-defaults A (same unit key, different content) yaml file. Ensure destination repo1 has just one modulemd-defaults A (we can have just one defaults for module A per repo).

Sync/remove scenario ( modulemd_defaults)

  1. create and sync modular repo
  2. remove modulemd-defaults A; ensure it was removed from the repo. Nothing else is removed.

Upload scenario ( modulemd and defaults)

  1. upload a modules.yaml doc file that contains one or more modulemd and one or more modulemd_defaults documents.
  2. Ensure that pulp was able to parse the document by identifying the correct content types, modulemd and modulemd_defaults.
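The parsing expectation in the upload scenario can be sketched without a YAML library by splitting a multi-document modules.yaml on its '---' separators and reading each document's document: field; this is a deliberate simplification of real YAML parsing, and the sample documents are illustrative:

```python
MODULES_YAML = """\
---
document: modulemd
version: 2
data:
  name: kangaroo
...
---
document: modulemd-defaults
version: 1
data:
  module: kangaroo
...
"""

def classify(text):
    """Count modulemd vs modulemd-defaults documents in a modules.yaml."""
    counts = {'modulemd': 0, 'modulemd-defaults': 0}
    for doc in text.split('---'):
        for line in doc.splitlines():
            if line.startswith('document:'):
                kind = line.split(':', 1)[1].strip()
                if kind in counts:
                    counts[kind] += 1
    return counts

counts = classify(MODULES_YAML)
```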

Timeout error on OpenSuseErrataTestCase sync operation

The test:

tests/rpm/api_v2/test_updateinfo.py::OpenSuseErrataTestCase::test_01_create_sync_publish <- ../../.virtualenvs/pulp-2-tests/lib64/python3.5/site-packages/pulp_2_tests/tests/rpm/api_v2/test_updateinfo.py FAILED 

is failing with:

pulp_smash.exceptions.TaskTimedOutError: Task /pulp/api/v2/tasks/007c7852-068f-49a3-b8e9-63a38bffc7d0/ is ongoing after 900 polls.

when trying to sync from: https://download.opensuse.org/update/leap/42.3/oss/

The test passes locally but takes a long time to finish; we should find a way to make it faster (a local fixture or a smaller repo).

complete traceback:

17:58:14 =================================== FAILURES ===================================
17:58:14 ______________ SyncInvalidFeedTestCase.test_task_error_traceback _______________
17:58:14 
17:58:14 self = <pulp_2_tests.tests.puppet.api_v2.test_sync_publish.SyncInvalidFeedTestCase testMethod=test_task_error_traceback>
17:58:14 
17:58:14     def test_task_error_traceback(self):
17:58:14         """Assert the task's "error" and "traceback" fields are non-null."""
17:58:14         for key in {'error', 'traceback'}:
17:58:14             with self.subTest(key=key):
17:58:14 >               self.assertIsNotNone(self.tasks[0][key])
17:58:14 E               AssertionError: unexpectedly None
17:58:14 
17:58:14 ../../.virtualenvs/pulp-2-tests/lib64/python3.5/site-packages/pulp_2_tests/tests/puppet/api_v2/test_sync_publish.py:213: AssertionError
17:58:14 ______________ OpenSuseErrataTestCase.test_01_create_sync_publish ______________
17:58:14 
17:58:14 self = <pulp_2_tests.tests.rpm.api_v2.test_updateinfo.OpenSuseErrataTestCase testMethod=test_01_create_sync_publish>
17:58:14 
17:58:14     def test_01_create_sync_publish(self):
17:58:14         """Create, sync, and publish an openSUSE repository."""
17:58:14         cfg = config.get_config()
17:58:14         if not selectors.bug_is_fixed(3377, cfg.pulp_version):
17:58:14             self.skipTest('https://pulp.plan.io/issues/3377')
17:58:14     
17:58:14         # Create, sync, and publish a repository.
17:58:14         client = api.Client(cfg, api.json_handler)
17:58:14         body = gen_repo()
17:58:14         body['importer_config']['download_policy'] = 'on_demand'
17:58:14         body['importer_config']['feed'] = OPENSUSE_FEED_URL
17:58:14         body['distributors'] = [gen_distributor()]
17:58:14         repo = client.post(REPOSITORY_PATH, body)
17:58:14         self.addCleanup(client.delete, repo['_href'])
17:58:14 >       sync_repo(cfg, repo)
17:58:14 
17:58:14 ../../.virtualenvs/pulp-2-tests/lib64/python3.5/site-packages/pulp_2_tests/tests/rpm/api_v2/test_updateinfo.py:621: 
17:58:14 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
17:58:14 ../../.virtualenvs/pulp-2-tests/lib64/python3.5/site-packages/pulp_smash/pulp2/utils.py:483: in sync_repo
17:58:14     return api.Client(cfg).post(urljoin(repo['_href'], 'actions/sync/'))
17:58:14 ../../.virtualenvs/pulp-2-tests/lib64/python3.5/site-packages/pulp_smash/api.py:443: in post
17:58:14     return self.request('POST', url, **kwargs)
17:58:14 ../../.virtualenvs/pulp-2-tests/lib64/python3.5/site-packages/pulp_smash/api.py:481: in request
17:58:14     requests.request(method, **request_kwargs),
17:58:14 ../../.virtualenvs/pulp-2-tests/lib64/python3.5/site-packages/pulp_smash/api.py:135: in safe_handler
17:58:14     _handle_202(client._cfg, response, client.pulp_host)  # pylint:disable=protected-access
17:58:14 ../../.virtualenvs/pulp-2-tests/lib64/python3.5/site-packages/pulp_smash/api.py:83: in _handle_202
17:58:14     tasks = tuple(poll_spawned_tasks(cfg, call_report, pulp_host))
17:58:14 ../../.virtualenvs/pulp-2-tests/lib64/python3.5/site-packages/pulp_smash/api.py:507: in poll_spawned_tasks
17:58:14     for final_task_state in poll_task(cfg, href, pulp_host):
17:58:14 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
17:58:14 
17:58:14 cfg = PulpSmashConfig(pulp_version='2.16.4', pulp_auth=['admin', 'admin'], hosts=[PulpHost(hostname='host-172-16-46-13.opens...erify': '/etc/pki/ca-trust/source/anchors/pulp-cacert.pem'}, 'pulp resource manager': {}})], pulp_selinux_enabled=True)
17:58:14 href = '/pulp/api/v2/tasks/007c7852-068f-49a3-b8e9-63a38bffc7d0/'
17:58:14 pulp_host = PulpHost(hostname='host-172-16-46-13.openstacklocal', roles={'amqp broker': {'service': 'qpidd'}, 'pulp cli': {}, 'pul...'api': {'scheme': 'https', 'verify': '/etc/pki/ca-trust/source/anchors/pulp-cacert.pem'}, 'pulp resource manager': {}})
17:58:14 
17:58:14     def poll_task(cfg, href, pulp_host=None):
17:58:14         """Wait for a task and its children to complete. Yield response bodies.
17:58:14     
17:58:14         Poll the task at ``href``, waiting for the task to complete. When a
17:58:14         response is received indicating that the task is complete, yield that
17:58:14         response body and recursively poll each child task.
17:58:14     
17:58:14         :param cfg: A :class:`pulp_smash.config.PulpSmashConfig` object.
17:58:14         :param href: The path to a task you'd like to monitor recursively.
17:58:14         :param pulp_host: The host to poll. If ``None``, a host will automatically
17:58:14             be selected by :class:`Client`.
17:58:14         :returns: An generator yielding response bodies.
17:58:14         :raises pulp_smash.exceptions.TaskTimedOutError: If a task takes too
17:58:14             long to complete.
17:58:14         """
17:58:14         # 900 * 2s == 1800s == 30m
17:58:14         # NOTE: The timeout counter is synchronous. We query Pulp, then count down,
17:58:14         # then query pulp, then count down, etc. This is… dumb.
17:58:14         poll_limit = 900
17:58:14         poll_counter = 0
17:58:14         json_client = Client(cfg, json_handler, pulp_host=pulp_host)
17:58:14         while True:
17:58:14             task = json_client.get(href)
17:58:14             if cfg.pulp_version < Version('3'):
17:58:14                 task_end_states = _TASK_END_STATES
17:58:14             else:
17:58:14                 task_end_states = _P3_TASK_END_STATES
17:58:14             if task['state'] in task_end_states:
17:58:14                 # This task has completed. Yield its final state, then recursively
17:58:14                 # iterate through children and yield their final states.
17:58:14                 yield task
17:58:14                 for spawned_task in task['spawned_tasks']:
17:58:14                     for descendant_tsk in poll_task(cfg, spawned_task['_href'], pulp_host):
17:58:14                         yield descendant_tsk
17:58:14                 break
17:58:14             poll_counter += 1
17:58:14             if poll_counter > poll_limit:
17:58:14                 raise exceptions.TaskTimedOutError(
17:58:14 >                   'Task {} is ongoing after {} polls.'.format(href, poll_limit)
17:58:14                 )
17:58:14 E               pulp_smash.exceptions.TaskTimedOutError: Task /pulp/api/v2/tasks/007c7852-068f-49a3-b8e9-63a38bffc7d0/ is ongoing after 900 polls.
17:58:14 
17:58:14 ../../.virtualenvs/pulp-2-tests/lib64/python3.5/site-packages/pulp_smash/api.py:549: TaskTimedOutError

As a user I can sync Oracle Linux 7, which Pulp previously had a bug syncing

Migrated from pulp/pulp-smash#1065
author: @bmbouter - https://github.com/bmbouter
date created: 2018-06-12T18:36:57Z

This is for bug https://pulp.plan.io/issues/3535

To sum up the issue: when syncing and publishing an Oracle Linux 7 repo, Pulp threw a fatal exception on publish and could not publish the repo. With the fix, the repo publishes.

The reproducer steps I used during development are the second batch (OL7) here: https://pulp.plan.io/issues/3535#note-15

Test "conservative" recursive copy

Normal recursive copy will copy all dependencies to the target repo if versions of those dependencies are newer than in the target repo.

"Conservative" recursive copy will perform depsolving to determine whether the dependency is satisfied in the target repository, and if so, will not copy the newer deps.

See the linked issue for a more concrete example.

Additionally, this flag should NOT affect copies of modular content, only copies of non-modular content. All dependencies of modular content should always be copied, just like a normal recursive copy, even if this flag is specified.

https://pulp.plan.io/issues/2478
pulp/pulp_rpm@cb4d33c
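The copy decision described above can be sketched for a single dependency; the version comparison is a simplified tuple compare, not real RPM NEVRA ordering or depsolving:

```python
def should_copy_dep(source_version, target_versions, conservative=False):
    """Decide whether a dependency gets copied during recursive copy.

    ``target_versions`` holds versions of the same dependency already in
    the target repo, as comparable tuples (simplified stand-in for real
    depsolving).
    """
    if not target_versions:
        return True        # dependency missing entirely: always copy
    if conservative:
        return False       # any present version satisfies it: skip
    # Normal recursive copy: copy when the source version is newer.
    return source_version > max(target_versions)
```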

Non-unique collection names in erratum pkglists cause yum not to show all packages

Migrated from pulp/pulp-smash#539
author: @preethit - https://github.com/preethit
date created: 2017-02-03T19:24:54Z

https://pulp.plan.io/issues/2560

  1. create repo A1 with packages P1 and P2, and erratum E with packages P1 and P2 in its pkglist/collection.
  2. create repo A2 which is a copy of repo A1.
  3. remove P1 from repo A1
  4. remove P2 from repo A2
  5. publish both repos
  6. enable both repos for your yum client
  7. run yum updateinfo list (assuming you have old versions of P1 and P2 installed)
  8. observe that only one package (P1 or P2, depending on which repo was "the first") is listed for update
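Why step 8 happens can be modeled in a few lines: if a client keys collections by name, a duplicate name silently overwrites the earlier package list. This is a toy model of the merge, not yum's actual code:

```python
def merge_errata_pkglists(collections):
    """Toy model of a client merging erratum collections keyed by name:
    a duplicate collection name silently overwrites the earlier one."""
    merged = {}
    for collection in collections:
        merged[collection['name']] = collection['packages']
    return merged

# After the removals in steps 3-4, both published repos carry erratum E
# with the SAME collection name but different package lists.
repo_a1 = {'name': 'collection-1', 'packages': ['P2']}  # P1 removed
repo_a2 = {'name': 'collection-1', 'packages': ['P1']}  # P2 removed
merged = merge_errata_pkglists([repo_a1, repo_a2])
# Only one package list survives, so only one package shows for update.
```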

Test to make sure that multiple versions of puppet modules can be uploaded and published

Migrated from pulp/pulp-smash#757
author: @preethit - https://github.com/preethit
date created: 2017-08-17T17:29:55Z

  1. Upload a puppet module with multiple versions (e.g., motd 1.2.0 and motd 1.2.1).
  2. Publish the repo with motd 1.2.0.
  3. Now, add the motd 1.2.1 module to the same repo.
  4. Publish the repo.

Second Scenario

  1. Upload a puppet module with multiple versions (e.g., motd 1.2.0 and motd 1.2.1).
  2. Publish the repo with motd 1.2.0.
  3. Now, update the same repo with a feed which has the motd 1.2.1 module.
  4. Sync and publish the repo.

FixFileCorruptionTestCase fails when run in isolation

This fails:

python -m unittest --failfast --verbose pulp_2_tests.tests.rpm.api_v2.test_download_policies.FixFileCorruptionTestCase

This succeeds:

python -m unittest --failfast --verbose pulp_2_tests.tests.rpm.api_v2.test_download_policies

Looks like there's a design bug with FixFileCorruptionTestCase. It should be tracked down and fixed so that the test case can be run in isolation. Notably, the test case inherits from the deprecated BaseAPITestCase.


Test puppet modules with extraneous files

Migrated from pulp/pulp-smash#627
author: @daviddavis - https://github.com/daviddavis
date created: 2017-04-27T22:06:56Z

See this issue for a list of potential puppet modules that could contain extraneous files:

https://pulp.plan.io/issues/1846

marcel-passenger-0.5.0.tar.gz is a good one.

When you extract one of these modules, it should contain something to the effect of:

-rwxr-xr-x   1  501 user  311 Mar 27  2015 ._marcel-passenger-0.5.0
drwxr-xr-x   7  501 user 4.0K Mar 27  2015 marcel-passenger-0.5.0

The ._marcel-passenger-0.5.0 file is a MacOS metadata file. It was causing our code to fail before.

Steps

  1. Create a puppet repo
  2. Upload the puppet module to your repo
  3. It should be successful

The original bug was a bit nondeterministic; sometimes it occurred, sometimes it did not (depending on the order of the files listed by os.listdir()). The fix should solve it, though, by ignoring the file.
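Both the fixture and the fix can be sketched in memory: build a tarball containing a stray MacOS `._` entry alongside the real module directory, then filter such entries out. The filtering predicate is an assumption about how the fix ignores these files:

```python
import io
import tarfile

# Build an in-memory tarball shaped like marcel-passenger-0.5.0.tar.gz,
# including the stray MacOS AppleDouble entry.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode='w:gz') as tar:
    for name in ('._marcel-passenger-0.5.0',
                 'marcel-passenger-0.5.0/metadata.json'):
        data = b'{}'
        info = tarfile.TarInfo(name)
        info.size = len(data)
        tar.addfile(info, io.BytesIO(data))

buf.seek(0)
with tarfile.open(fileobj=buf, mode='r:gz') as tar:
    # Assumed fix: skip AppleDouble ('._*') entries rather than treating
    # one as the module directory.
    members = [m.name for m in tar.getmembers()
               if not m.name.rsplit('/', 1)[-1].startswith('._')]
```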

Test that errata metadata is updated when needed

Migrated from pulp/pulp-smash#237
author: @goosemania - https://github.com/goosemania
date created: 2016-05-06T11:13:21Z

Erratum metadata can be updated during sync or upload of the erratum (with the same id).
The decision to update it or not is made based on the updated field of the erratum.
The expected format for the updated field is '%Y-%m-%d %H:%M:%S UTC' (the 'UTC' suffix is optional).

Suggestions for the tests:

  • the erratum is updated if updated field indicates that the synced or uploaded erratum is newer
  • the erratum is not updated if updated field indicates that the synced or uploaded erratum is older
  • the erratum is not updated if updated field of either the existing erratum or the newly synced/uploaded erratum is in the wrong format (task will fail in this case).
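The format rule and the newer/older decision above can be sketched with datetime; stripping the optional 'UTC' suffix before parsing is an assumption about normalization, not Pulp's actual code:

```python
from datetime import datetime

def parse_updated(value):
    """Parse an erratum ``updated`` field: '%Y-%m-%d %H:%M:%S', with an
    optional trailing 'UTC'. Any other format raises ValueError."""
    if value.endswith('UTC'):
        value = value[:-3].rstrip()
    return datetime.strptime(value, '%Y-%m-%d %H:%M:%S')

def should_update(existing, incoming):
    """Update the stored erratum only when the incoming one is newer."""
    return parse_updated(incoming) > parse_updated(existing)
```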

Main steps:

  1. Create and sync the repo which contains at least one erratum.
    2.1 Create and sync the other repo which contains the erratum with the same id as the repo in step 1 but with different date in the updated field.
    2.2 Or upload the erratum with the same id as in the repo in step 1 but with different date in the updated field.

The way to upload the erratum and to search the erratum with a particular id.
Here is the list of the erratum fields which can be updated.

Pulp issue: https://pulp.plan.io/issues/858

Adds task-ids to logs

Migrated from pulp/pulp-smash#483
author: @bmbouter - https://github.com/bmbouter
date created: 2017-01-13T20:34:21Z

This is for story: https://pulp.plan.io/issues/2324

Verify a log statement running inside a task

  1. Sync the zoo repo: pulp-admin rpm repo sync run --repo-id zoo
  2. Verify that you see the log statement [75suq9sy] pulp_rpm.plugins.importers.yum.sync:INFO: Parsing metadata. Note that the characters between the brackets will be different.

Verify a log statement running outside a task

  1. Try to login with a bad username and password: pulp-admin login -u admin -p notright
  2. Verify that you see the log statement pulp.server.webservices.middleware.exception:INFO: Authentication with username admin failed: invalid username or password
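The in-task log check (step 2 of the first scenario) can be sketched as a regex; the 8-character lowercase-alphanumeric correlation id is an assumption based on the single example above, and may vary:

```python
import re

# Correlation id assumed to be 8 lowercase alphanumerics, as in the
# example '[75suq9sy]'; the exact alphabet/width is an assumption.
TASK_LOG = re.compile(
    r'^\[[0-9a-z]{8}\] '
    r'pulp_rpm\.plugins\.importers\.yum\.sync:INFO: Parsing metadata\.?$'
)

line = '[75suq9sy] pulp_rpm.plugins.importers.yum.sync:INFO: Parsing metadata.'
match = TASK_LOG.match(line)
```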

Test the puppet install distributor subdir option

Migrated from pulp/pulp-smash#495
author: @werwty - https://github.com/werwty
date created: 2017-01-19T21:11:46Z

https://pulp.plan.io/issues/2108

Docs here: http://docs.pulpproject.org/en/nightly/plugins/pulp_puppet/tech-reference/plugin_conf.html#install-distributor
Release notes here: http://docs.pulpproject.org/en/nightly/plugins/pulp_puppet/user-guide/release-notes/2.13.x.html#pulp-puppet-2-13-0

Validate subdir creation

  1. pulp-admin puppet repo create --repo-id=repo1
  2. pulp-admin puppet repo uploads upload --file puppetlabs-apache-1.11.0.tar.gz --repo-id repo1
  3. curl --user admin:admin --data '{"distributor_type_id":"puppet_install_distributor", "distributor_id":"puppet_tmp_install_distributor", "distributor_config": {"install_path":"/etc/puppet/environments/MYENV", "subdir":"modules"}}' https://dev.example.com/pulp/api/v2/repositories/repo1/distributors/
  4. curl --user admin:admin --data '{"id": "puppet_tmp_install_distributor"}' https://dev.example.com/pulp/api/v2/repositories/repo1/actions/publish/
  5. validate that /etc/puppet/environments/MYENV/modules exists
  6. pulp-admin puppet repo delete --repo-id=repo1
  7. validate that /etc/puppet/environments has no directory MYENV in it

Validate update existing repo with subdir

  1. pulp-admin puppet repo create --repo-id=repo2
  2. pulp-admin puppet repo uploads upload --file puppetlabs-apache-1.11.0.tar.gz --repo-id repo2
  3. curl --user admin:admin --data '{"distributor_type_id":"puppet_install_distributor", "distributor_id":"puppet_tmp_install_distributor", "distributor_config": {"install_path":"/etc/puppet/environments/MYENV2/modules"}}' https://dev.example.com/pulp/api/v2/repositories/repo2/distributors/
  4. curl -X PUT --user admin:admin --data '{"distributor_configs": {"puppet_tmp_install_distributor": {"install_path":"/etc/puppet/environments/MYENV2", "subdir": "modules2"}}}' https://dev.example.com/pulp/api/v2/repositories/repo2/
  5. curl --user admin:admin --data '{"id": "puppet_tmp_install_distributor"}' https://dev.example.com/pulp/api/v2/repositories/repo2/actions/publish/
  6. validate that /etc/puppet/environments/MYENV2/modules2 exists
  7. pulp-admin puppet repo delete --repo-id=repo2
  8. validate that /etc/puppet/environments has no directory MYENV2 in it
