report.html

Report generated on 20-Oct-2020 at 21:56:48 by pytest-html v2.1.1

Environment

389-ds-base 1.4.3.12-1.fc32
Packages {"pluggy": "0.13.1", "py": "1.9.0", "pytest": "5.4.3"}
Platform Linux-5.7.7-200.fc32.x86_64-x86_64-with-glibc2.2.5
Plugins {"html": "2.1.1", "libfaketime": "0.1.2", "metadata": "1.10.0"}
Python 3.8.6
cyrus-sasl 2.1.27-4.fc32
nspr 4.29.0-1.fc32
nss 3.57.0-1.fc32
openldap 2.4.47-5.fc32
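
Rows such as 389-ds-base, nss, and openldap are not produced by pytest-html itself; they are extra environment metadata. A minimal sketch of how a suite typically injects such rows through the pytest-metadata plugin listed above (illustrative only; the actual conftest.py of this suite is not part of the report, and the value shown is just the one from this run):

# conftest.py (illustrative sketch)
def pytest_configure(config):
    # pytest-metadata exposes the environment table as config._metadata;
    # keys added here show up in the pytest-html "Environment" section.
    if getattr(config, "_metadata", None) is not None:
        config._metadata["389-ds-base"] = "1.4.3.12-1.fc32"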

Summary

1764 tests ran in 9965.81 seconds.

1709 passed, 52 skipped, 31 failed, 274 errors, 15 expected failures, 9 unexpected passes
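
The error entries below are cascading setup failures: one module-scoped fixture failed and every dependent test reports the same traceback. Once the underlying replication issue is fixed, the affected module can be re-run on its own and a fresh report generated with the plugins already used for this run, for example (path as shown in the Results table, relative to the suite's test root):

pytest -v --html=report.html suites/acl/acl_test.py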

Results

Result Test Duration Links
Error suites/acl/acl_test.py::test_aci_attr_subtype_targetattr[lang-ja]::setup 36.74
request = <SubRequest 'topology_m2' for <Function test_aci_attr_subtype_targetattr[lang-ja]>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f8ac95e0>
from_instance = <lib389.DirSrv object at 0x7fa5f819a8b0>
to_instance = <lib389.DirSrv object at 0x7fa5f8f9bb50>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e72a458d-253e-4f8c-8de0-99e039c1c005 / got description=None)
[the Retry line above appears 20 times in the original log, once per second until the 20 s timeout]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e72a458d-253e-4f8c-8de0-99e039c1c005 / got description=None)
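
The probe that failed here is simple: wait_for_replication writes a fresh UUID to the description attribute of the replication_managers group on the first master, then polls the second master once per second for up to timeout seconds (20 in this run). A minimal sketch of re-running the same probe by hand with a longer timeout, using only the lib389 API already shown in the traceback (assumptions: m1 and m2 are the running DirSrv instances for master1 and master2, and the raised timeout is a diagnostic guess, not a confirmed fix):

from lib389._constants import DEFAULT_SUFFIX
from lib389.replica import ReplicationManager

repl = ReplicationManager(DEFAULT_SUFFIX)  # suffix dc=example,dc=com, as above
# The same call the topology_m2 fixture makes, with timeout raised from 20 s.
repl.wait_for_replication(m1, m2, timeout=120)
repl.wait_for_replication(m2, m1, timeout=120)

If the longer wait still ends with description=None on the consumer, the agreements exist (see the SUCCESS lines above) but changes are not flowing, which points at the supplier-to-consumer connection rather than at a slow sync.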
The remaining 19 tests in the module errored in setup with a traceback identical to the one above; only the test id and duration differ (see the note after this list for why a single fixture failure fans out like this):

Error suites/acl/acl_test.py::test_aci_attr_subtype_targetattr[binary]::setup 0.00
Error suites/acl/acl_test.py::test_aci_attr_subtype_targetattr[phonetic]::setup 0.00
Error suites/acl/acl_test.py::test_mode_default_add_deny::setup 0.00
Error suites/acl/acl_test.py::test_mode_default_delete_deny::setup 0.00
Error suites/acl/acl_test.py::test_moddn_staging_prod[0-cn=staged user,dc=example,dc=com-cn=accounts,dc=example,dc=com-False]::setup 0.00
Error suites/acl/acl_test.py::test_moddn_staging_prod[1-cn=staged user,dc=example,dc=com-cn=accounts,dc=example,dc=com-False]::setup 0.00
Error suites/acl/acl_test.py::test_moddn_staging_prod[2-cn=staged user,dc=example,dc=com-cn=bad*,dc=example,dc=com-True]::setup 0.00
Error suites/acl/acl_test.py::test_moddn_staging_prod[3-cn=st*,dc=example,dc=com-cn=accounts,dc=example,dc=com-False]::setup 0.00
Error suites/acl/acl_test.py::test_moddn_staging_prod[4-cn=bad*,dc=example,dc=com-cn=accounts,dc=example,dc=com-True]::setup 0.00
Error suites/acl/acl_test.py::test_moddn_staging_prod[5-cn=st*,dc=example,dc=com-cn=ac*,dc=example,dc=com-False]::setup 0.00
Error suites/acl/acl_test.py::test_moddn_staging_prod[6-None-cn=ac*,dc=example,dc=com-False]::setup 0.00
Error suites/acl/acl_test.py::test_moddn_staging_prod[7-cn=st*,dc=example,dc=com-None-False]::setup 0.00
Error suites/acl/acl_test.py::test_moddn_staging_prod[8-None-None-False]::setup 0.00
Error suites/acl/acl_test.py::test_moddn_staging_prod_9::setup 0.00
Error suites/acl/acl_test.py::test_moddn_prod_staging::setup 0.00
Error suites/acl/acl_test.py::test_check_repl_M2_to_M1::setup 0.00
Error suites/acl/acl_test.py::test_moddn_staging_prod_except::setup 0.00
Error suites/acl/acl_test.py::test_mode_default_ger_no_moddn::setup 0.00
Error suites/acl/acl_test.py::test_mode_default_ger_with_moddn::setup 0.00
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/acl/acl_test.py::test_mode_legacy_ger_no_moddn1::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_aci_attr_subtype_targetattr[lang-ja]>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f8ac95e0>
from_instance = <lib389.DirSrv object at 0x7fa5f819a8b0>
to_instance = <lib389.DirSrv object at 0x7fa5f8f9bb50>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/acl/acl_test.py::test_mode_legacy_ger_no_moddn2::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_aci_attr_subtype_targetattr[lang-ja]>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f8ac95e0>
from_instance = <lib389.DirSrv object at 0x7fa5f819a8b0>
to_instance = <lib389.DirSrv object at 0x7fa5f8f9bb50>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/acl/acl_test.py::test_mode_legacy_ger_with_moddn::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_aci_attr_subtype_targetattr[lang-ja]>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f8ac95e0>
from_instance = <lib389.DirSrv object at 0x7fa5f819a8b0>
to_instance = <lib389.DirSrv object at 0x7fa5f8f9bb50>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/acl/acl_test.py::test_rdn_write_get_ger::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_aci_attr_subtype_targetattr[lang-ja]>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f8ac95e0>
from_instance = <lib389.DirSrv object at 0x7fa5f819a8b0>
to_instance = <lib389.DirSrv object at 0x7fa5f8f9bb50>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/acl/acl_test.py::test_rdn_write_modrdn_anonymous::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_aci_attr_subtype_targetattr[lang-ja]>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f8ac95e0>
from_instance = <lib389.DirSrv object at 0x7fa5f819a8b0>
to_instance = <lib389.DirSrv object at 0x7fa5f8f9bb50>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/basic/basic_test.py::test_dscreate_ldapi::setup 0.52
request = <SubRequest 'dscreate_long_instance' for <Function test_dscreate_ldapi>>

@pytest.fixture(scope="function")
def dscreate_long_instance(request):
template_file = "/tmp/dssetup.inf"
longname_serverid = "test-longname-deadbeef-deadbeef-deadbeef-deadbeef-deadbeef"
template_text = """[general]
config_version = 2
# This invalid hostname ...
full_machine_name = localhost.localdomain
# Means we absolutely require this.
strict_host_checking = False
# In tests, we can be run in containers, NEVER trust
# that systemd is there, or functional in any capacity
systemd = False

[slapd]
instance_name = %s
root_dn = cn=directory manager
root_password = someLongPassword_123
# We do not have access to high ports in containers,
# so default to something higher.
port = 38999
secure_port = 63699


[backend-userroot]
suffix = dc=example,dc=com
sample_entries = yes
""" % longname_serverid

with open(template_file, "w") as template_fd:
template_fd.write(template_text)

# Unset PYTHONPATH to avoid mixing old CLI tools and new lib389
tmp_env = os.environ
if "PYTHONPATH" in tmp_env:
del tmp_env["PYTHONPATH"]
try:
> subprocess.check_call([
'dscreate',
'from-file',
template_file
], env=tmp_env)

suites/basic/basic_test.py:1312:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

popenargs = (['dscreate', 'from-file', '/tmp/dssetup.inf'],)
kwargs = {'env': environ({'SHELL': '/bin/bash', 'DS_SKIP_LABEL': '1', '__INTERNAL_PERSISTENT_DATA': '/var/tmp/beakerlib-8592800...st', 'FAKETIME_DID_REEXEC': 'true', 'PYTEST_CURRENT_TEST': 'suites/basic/basic_test.py::test_dscreate_ldapi (setup)'})}
retcode = 1, cmd = ['dscreate', 'from-file', '/tmp/dssetup.inf']

def check_call(*popenargs, **kwargs):
"""Run command with arguments. Wait for command to complete. If
the exit code was zero then return, otherwise raise
CalledProcessError. The CalledProcessError object will have the
return code in the returncode attribute.

The arguments are the same as for the call function. Example:

check_call(["ls", "-l"])
"""
retcode = call(*popenargs, **kwargs)
if retcode:
cmd = kwargs.get("args")
if cmd is None:
cmd = popenargs[0]
> raise CalledProcessError(retcode, cmd)
E subprocess.CalledProcessError: Command '['dscreate', 'from-file', '/tmp/dssetup.inf']' returned non-zero exit status 1.

/usr/lib64/python3.8/subprocess.py:364: CalledProcessError

During handling of the above exception, another exception occurred:

request = <SubRequest 'dscreate_long_instance' for <Function test_dscreate_ldapi>>

@pytest.fixture(scope="function")
def dscreate_long_instance(request):
template_file = "/tmp/dssetup.inf"
longname_serverid = "test-longname-deadbeef-deadbeef-deadbeef-deadbeef-deadbeef"
template_text = """[general]
config_version = 2
# This invalid hostname ...
full_machine_name = localhost.localdomain
# Means we absolutely require this.
strict_host_checking = False
# In tests, we can be run in containers, NEVER trust
# that systemd is there, or functional in any capacity
systemd = False

[slapd]
instance_name = %s
root_dn = cn=directory manager
root_password = someLongPassword_123
# We do not have access to high ports in containers,
# so default to something higher.
port = 38999
secure_port = 63699


[backend-userroot]
suffix = dc=example,dc=com
sample_entries = yes
""" % longname_serverid

with open(template_file, "w") as template_fd:
template_fd.write(template_text)

# Unset PYTHONPATH to avoid mixing old CLI tools and new lib389
tmp_env = os.environ
if "PYTHONPATH" in tmp_env:
del tmp_env["PYTHONPATH"]
try:
subprocess.check_call([
'dscreate',
'from-file',
template_file
], env=tmp_env)
except subprocess.CalledProcessError as e:
log.fatal("dscreate failed! Error ({}) {}".format(e.returncode, e.output))
> assert False
E assert False

suites/basic/basic_test.py:1319: AssertionError
-----------------------------Captured stdout setup------------------------------
Starting installation...
Error: port 38999 is already in use, or missing NET_BIND_SERVICE
-------------------------------Captured log setup-------------------------------
CRITICAL tests.suites.basic.basic_test:basic_test.py:1318 dscreate failed! Error (1) None
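Note: here the traceback is only the messenger; the captured stdout carries the root cause. dscreate exited non-zero because port 38999, hard-coded in the fixture's .inf template, was already bound (most likely by a stale instance from an earlier aborted run). Below is a small pre-flight probe of the kind one could run before dscreate; the bind-test approach and the port_is_free() helper are illustrative, not part of basic_test.py:

# Hedged sketch: check whether the fixture's hard-coded port is free
# before invoking dscreate. port_is_free() is a hypothetical helper.
import socket

def port_is_free(port, host="0.0.0.0"):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        try:
            s.bind((host, port))
            return True
        except OSError:
            return False

if not port_is_free(38999):
    print("port 38999 is already in use; remove the stale instance first")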
Error suites/basic/basic_test.py::test_dscreate_multiple_dashes_name::setup 0.51
request = <SubRequest 'dscreate_long_instance' for <Function test_dscreate_multiple_dashes_name>>

@pytest.fixture(scope="function")
def dscreate_long_instance(request):
template_file = "/tmp/dssetup.inf"
longname_serverid = "test-longname-deadbeef-deadbeef-deadbeef-deadbeef-deadbeef"
template_text = """[general]
config_version = 2
# This invalid hostname ...
full_machine_name = localhost.localdomain
# Means we absolutely require this.
strict_host_checking = False
# In tests, we can be run in containers, NEVER trust
# that systemd is there, or functional in any capacity
systemd = False

[slapd]
instance_name = %s
root_dn = cn=directory manager
root_password = someLongPassword_123
# We do not have access to high ports in containers,
# so default to something higher.
port = 38999
secure_port = 63699


[backend-userroot]
suffix = dc=example,dc=com
sample_entries = yes
""" % longname_serverid

with open(template_file, "w") as template_fd:
template_fd.write(template_text)

# Unset PYTHONPATH to avoid mixing old CLI tools and new lib389
tmp_env = os.environ
if "PYTHONPATH" in tmp_env:
del tmp_env["PYTHONPATH"]
try:
> subprocess.check_call([
'dscreate',
'from-file',
template_file
], env=tmp_env)

suites/basic/basic_test.py:1312:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

popenargs = (['dscreate', 'from-file', '/tmp/dssetup.inf'],)
kwargs = {'env': environ({'SHELL': '/bin/bash', 'DS_SKIP_LABEL': '1', '__INTERNAL_PERSISTENT_DATA': '/var/tmp/beakerlib-8592800...DID_REEXEC': 'true', 'PYTEST_CURRENT_TEST': 'suites/basic/basic_test.py::test_dscreate_multiple_dashes_name (setup)'})}
retcode = 1, cmd = ['dscreate', 'from-file', '/tmp/dssetup.inf']

def check_call(*popenargs, **kwargs):
"""Run command with arguments. Wait for command to complete. If
the exit code was zero then return, otherwise raise
CalledProcessError. The CalledProcessError object will have the
return code in the returncode attribute.

The arguments are the same as for the call function. Example:

check_call(["ls", "-l"])
"""
retcode = call(*popenargs, **kwargs)
if retcode:
cmd = kwargs.get("args")
if cmd is None:
cmd = popenargs[0]
> raise CalledProcessError(retcode, cmd)
E subprocess.CalledProcessError: Command '['dscreate', 'from-file', '/tmp/dssetup.inf']' returned non-zero exit status 1.

/usr/lib64/python3.8/subprocess.py:364: CalledProcessError

During handling of the above exception, another exception occurred:

request = <SubRequest 'dscreate_long_instance' for <Function test_dscreate_multiple_dashes_name>>

@pytest.fixture(scope="function")
def dscreate_long_instance(request):
template_file = "/tmp/dssetup.inf"
longname_serverid = "test-longname-deadbeef-deadbeef-deadbeef-deadbeef-deadbeef"
template_text = """[general]
config_version = 2
# This invalid hostname ...
full_machine_name = localhost.localdomain
# Means we absolutely require this.
strict_host_checking = False
# In tests, we can be run in containers, NEVER trust
# that systemd is there, or functional in any capacity
systemd = False

[slapd]
instance_name = %s
root_dn = cn=directory manager
root_password = someLongPassword_123
# We do not have access to high ports in containers,
# so default to something higher.
port = 38999
secure_port = 63699


[backend-userroot]
suffix = dc=example,dc=com
sample_entries = yes
""" % longname_serverid

with open(template_file, "w") as template_fd:
template_fd.write(template_text)

# Unset PYTHONPATH to avoid mixing old CLI tools and new lib389
tmp_env = os.environ
if "PYTHONPATH" in tmp_env:
del tmp_env["PYTHONPATH"]
try:
subprocess.check_call([
'dscreate',
'from-file',
template_file
], env=tmp_env)
except subprocess.CalledProcessError as e:
log.fatal("dscreate failed! Error ({}) {}".format(e.returncode, e.output))
> assert False
E assert False

suites/basic/basic_test.py:1319: AssertionError
-----------------------------Captured stdout setup------------------------------
Starting installation...
Error: port 38999 is already in use, or missing NET_BIND_SERVICE
-------------------------------Captured log setup-------------------------------
CRITICAL tests.suites.basic.basic_test:basic_test.py:1318 dscreate failed! Error (1) None
Error suites/clu/repl_monitor_test.py::test_dsconf_replication_monitor::setup 37.62
request = <SubRequest 'topology_m2' for <Function test_dsconf_replication_monitor>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6d33370>
from_instance = <lib389.DirSrv object at 0x7fa5f83e9d90>
to_instance = <lib389.DirSrv object at 0x7fa5f8333550>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b27a6932-3fca-483a-8431-85744bb3cef6 / got description=None)
(the Retry message above is logged 20 times in total, once per second, before the final FAIL)
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b27a6932-3fca-483a-8431-85744bb3cef6 / got description=None)
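Note: each Retry record above is one pass of the polling loop shown in the traceback. The supplier's replication_managers group gets a fresh UUID written to its description, and the consumer is read back once per second until the value matches or the 20 polls are exhausted. The same probe written out standalone; the function name, variable names, and suffix argument are illustrative, assuming two connected DirSrv handles:

# Hedged sketch of the probe wait_for_replication() performs;
# supplier/consumer are assumed lib389.DirSrv handles.
import time
import uuid
from lib389.idm.group import Groups

def replication_converges(supplier, consumer, suffix, polls=20):
    marker = str(uuid.uuid4())
    s_group = Groups(supplier, basedn=suffix, rdn=None).get('replication_managers')
    c_group = Groups(consumer, basedn=suffix, rdn=None).get('replication_managers')
    s_group.replace('description', marker)  # write on the supplier
    for _ in range(polls):                  # read back on the consumer
        if c_group.get_attr_val_utf8('description') == marker:
            return True
        time.sleep(1)
    return False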
Error suites/config/config_test.py::test_maxbersize_repl::setup 41.47
request = <SubRequest 'topology_m2' for <Function test_maxbersize_repl>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6ea7700>
from_instance = <lib389.DirSrv object at 0x7fa5f6e81af0>
to_instance = <lib389.DirSrv object at 0x7fa5f829d0d0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 894b6d5f-2296-4a14-9abd-6ccd6831daab / got description=None)
(the Retry message above is logged 20 times in total, once per second, before the final FAIL)
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 894b6d5f-2296-4a14-9abd-6ccd6831daab / got description=None)
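Note: the captured stdout for this entry shows the fixture removing the instances it created, but when a run aborts before teardown, the leftover instances keep ports 39001/39002 (and 38999 above) occupied for every later module. One way to clear them between runs is the dsctl CLI that produced those "Instance ... removed." lines; the snippet below is a sketch, and the --do-it confirmation flag is an assumption about the dsctl shipped with this 389-ds-base build:

# Hedged sketch: remove leftover test instances so their ports free up.
# "--do-it" as dsctl's non-interactive confirmation flag is an assumption.
import subprocess

for serverid in ("slapd-master1", "slapd-master2"):
    subprocess.run(["dsctl", serverid, "remove", "--do-it"], check=False)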
Error suites/config/config_test.py::test_config_listen_backport_size::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_maxbersize_repl>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6ea7700>
from_instance = <lib389.DirSrv object at 0x7fa5f6e81af0>
to_instance = <lib389.DirSrv object at 0x7fa5f829d0d0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/config/config_test.py::test_config_deadlock_policy::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_maxbersize_repl>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6ea7700>
from_instance = <lib389.DirSrv object at 0x7fa5f6e81af0>
to_instance = <lib389.DirSrv object at 0x7fa5f829d0d0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/ds_tools/replcheck_test.py::test_state::setup 39.75
request = <SubRequest 'topo' for <Function test_state>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6ed9220>
from_instance = <lib389.DirSrv object at 0x7fa5f6d59760>
to_instance = <lib389.DirSrv object at 0x7fa5f6d57dc0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 635be265-de8d-464f-970a-92abd3d0a21b / got description=None)
(the Retry message above is logged 20 times in total, once per second, before the final FAIL)
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 635be265-de8d-464f-970a-92abd3d0a21b / got description=None)
Error suites/ds_tools/replcheck_test.py::test_check_ruv::setup 0.00
request = <SubRequest 'topo' for <Function test_state>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6ed9220>
from_instance = <lib389.DirSrv object at 0x7fa5f6d59760>
to_instance = <lib389.DirSrv object at 0x7fa5f6d57dc0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
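Editor's note: the failing check above is a plain poll-until-timeout loop: write a unique marker on the supplier, then poll the consumer until the marker replicates or the attempts run out. A minimal standalone sketch of that pattern follows; the helper name and the commented usage lines are ours for illustration, not lib389 API.

import time
import uuid

def wait_until(predicate, timeout=20, interval=1.0):
    """Poll predicate() once per interval; return True as soon as it
    holds, otherwise raise after roughly `timeout` attempts."""
    for _ in range(timeout):
        if predicate():
            return True
        time.sleep(interval)
    raise Exception("Condition did not hold in time!")

# Mirrors the traceback: replace 'description' with a fresh UUID on the
# supplier, then wait for the same value to appear on the consumer.
marker = str(uuid.uuid4())
# from_group.replace('description', marker)
# wait_until(lambda: to_group.get_attr_val_utf8('description') == marker)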
Error suites/ds_tools/replcheck_test.py::test_missing_entries::setup 0.00
request = <SubRequest 'topo' for <Function test_state>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6ed9220>
from_instance = <lib389.DirSrv object at 0x7fa5f6d59760>
to_instance = <lib389.DirSrv object at 0x7fa5f6d57dc0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/ds_tools/replcheck_test.py::test_tombstones::setup 0.00
request = <SubRequest 'topo' for <Function test_state>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6ed9220>
from_instance = <lib389.DirSrv object at 0x7fa5f6d59760>
to_instance = <lib389.DirSrv object at 0x7fa5f6d57dc0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/ds_tools/replcheck_test.py::test_conflict_entries::setup 0.00
request = <SubRequest 'topo' for <Function test_state>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6ed9220>
from_instance = <lib389.DirSrv object at 0x7fa5f6d59760>
to_instance = <lib389.DirSrv object at 0x7fa5f6d57dc0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/ds_tools/replcheck_test.py::test_inconsistencies::setup 0.00
request = <SubRequest 'topo' for <Function test_state>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6ed9220>
from_instance = <lib389.DirSrv object at 0x7fa5f6d59760>
to_instance = <lib389.DirSrv object at 0x7fa5f6d57dc0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/ds_tools/replcheck_test.py::test_suffix_exists::setup 0.00
request = <SubRequest 'topo' for <Function test_state>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6ed9220>
from_instance = <lib389.DirSrv object at 0x7fa5f6d59760>
to_instance = <lib389.DirSrv object at 0x7fa5f6d57dc0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/ds_tools/replcheck_test.py::test_check_missing_tombstones::setup 0.00
request = <SubRequest 'topo' for <Function test_state>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6ed9220>
from_instance = <lib389.DirSrv object at 0x7fa5f6d59760>
to_instance = <lib389.DirSrv object at 0x7fa5f6d57dc0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/ds_tools/replcheck_test.py::test_dsreplcheck_with_password_file::setup 0.00
request = <SubRequest 'topo' for <Function test_state>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6ed9220>
from_instance = <lib389.DirSrv object at 0x7fa5f6d59760>
to_instance = <lib389.DirSrv object at 0x7fa5f6d57dc0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/ds_tools/replcheck_test.py::test_dsreplcheck_timeout_connection_mechanisms::setup 0.00
request = <SubRequest 'topo' for <Function test_state>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6ed9220>
from_instance = <lib389.DirSrv object at 0x7fa5f6d59760>
to_instance = <lib389.DirSrv object at 0x7fa5f6d57dc0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
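Editor's note: every setup failure in this run stops at the default timeout=20 visible in the tracebacks. When triaging a slow CI host, the same check can be re-run by hand with a more forgiving timeout. A sketch follows; only wait_for_replication's signature is confirmed by the traceback above, while constructing ReplicationManager from the suffix is our assumption based on common lib389 usage.

from lib389.replica import ReplicationManager

DEFAULT_SUFFIX = "dc=example,dc=com"  # the suffix used throughout this run

def recheck_replication(m1, m2, timeout=60):
    # ReplicationManager(suffix) is assumed; wait_for_replication(from, to,
    # timeout) matches the method shown in the tracebacks.
    repl = ReplicationManager(DEFAULT_SUFFIX)
    return repl.wait_for_replication(m1, m2, timeout=timeout)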
Error suites/dynamic_plugins/dynamic_plugins_test.py::test_acceptance::setup 42.10
request = <SubRequest 'topology_m2' for <Function test_acceptance>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f710a1f0>
from_instance = <lib389.DirSrv object at 0x7fa5f72a6d60>
to_instance = <lib389.DirSrv object at 0x7fa5f712adf0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6f6227fa-7b7e-44c8-bfee-e2a0627e8d07 / got description=None) [identical Retry message logged for all 20 polling attempts]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6f6227fa-7b7e-44c8-bfee-e2a0627e8d07 / got description=None)
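Editor's note: the consumer answers description=None on every one of the 20 attempts, i.e. the change never arrives at all rather than arriving late. One way to confirm that by hand is to read the group's description attribute directly over LDAP. A sketch using python-ldap follows; the group DN (wait_for_replication builds the group directly under the suffix, with rdn=None) and the bind credentials are assumptions.

import ldap

def read_marker(uri, bind_dn, bind_pw,
                group_dn="cn=replication_managers,dc=example,dc=com"):
    """Return the current description value on one instance, or None."""
    conn = ldap.initialize(uri)
    conn.simple_bind_s(bind_dn, bind_pw)
    dn, attrs = conn.search_s(group_dn, ldap.SCOPE_BASE,
                              "(objectClass=*)", ["description"])[0]
    vals = attrs.get("description")
    return vals[0].decode("utf-8") if vals else None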
Error suites/dynamic_plugins/dynamic_plugins_test.py::test_memory_corruption::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_acceptance>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f710a1f0>
from_instance = <lib389.DirSrv object at 0x7fa5f72a6d60>
to_instance = <lib389.DirSrv object at 0x7fa5f712adf0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/dynamic_plugins/dynamic_plugins_test.py::test_stress::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_acceptance>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f710a1f0>
from_instance = <lib389.DirSrv object at 0x7fa5f72a6d60>
to_instance = <lib389.DirSrv object at 0x7fa5f712adf0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/fourwaymmr/fourwaymmr_test.py::test_verify_trees::setup 48.66
request = <SubRequest 'topo_m4' for <Function test_verify_trees>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6a00fa0>
from_instance = <lib389.DirSrv object at 0x7fa5f98d9130>
to_instance = <lib389.DirSrv object at 0x7fa5f67c4a00>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39004, 'ldap-secureport': 63704, 'server-id': 'master4', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bc9be63c-139f-452b-adcb-07a7471e0246 / got description=None) [identical Retry message logged for all 20 polling attempts]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bc9be63c-139f-452b-adcb-07a7471e0246 / got description=None)
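Editor's note: the four-master variant fails in exactly the same way while still joining master2, so masters 3 and 4 are never wired into the topology. For reference, the fixture pattern behind these entries looks roughly like the sketch below; create_topology, ReplicaRole, and the lib389.topologies path appear in the tracebacks, while the ReplicaRole import path and the elided teardown are assumptions.

import pytest
from lib389.topologies import create_topology
from lib389._constants import ReplicaRole  # import path assumed

@pytest.fixture(scope="module")
def topology_m4(request):
    """Create Replication Deployment with four masters."""
    topology = create_topology({ReplicaRole.MASTER: 4})
    # Teardown elided; the real fixture removes the instances again
    # (hence the "Instance slapd-master1 removed." stdout above).
    return topology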
Error suites/fourwaymmr/fourwaymmr_test.py::test_sync_through_to_all_4_masters::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_verify_trees>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6a00fa0>
from_instance = <lib389.DirSrv object at 0x7fa5f98d9130>
to_instance = <lib389.DirSrv object at 0x7fa5f67c4a00>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/fourwaymmr/fourwaymmr_test.py::test_modify_some_data_in_m3::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_verify_trees>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6a00fa0>
from_instance = <lib389.DirSrv object at 0x7fa5f98d9130>
to_instance = <lib389.DirSrv object at 0x7fa5f67c4a00>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/fourwaymmr/fourwaymmr_test.py::test_delete_a_few_entries_in_m4::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_verify_trees>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6a00fa0>
from_instance = <lib389.DirSrv object at 0x7fa5f98d9130>
to_instance = <lib389.DirSrv object at 0x7fa5f67c4a00>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/fourwaymmr/fourwaymmr_test.py::test_replicated_multivalued_entries::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_verify_trees>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6a00fa0>
from_instance = <lib389.DirSrv object at 0x7fa5f98d9130>
to_instance = <lib389.DirSrv object at 0x7fa5f67c4a00>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/fourwaymmr/fourwaymmr_test.py::test_bad_replication_agreement::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_verify_trees>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6a00fa0>
from_instance = <lib389.DirSrv object at 0x7fa5f98d9130>
to_instance = <lib389.DirSrv object at 0x7fa5f67c4a00>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/fourwaymmr/fourwaymmr_test.py::test_nsds5replicaenabled_verify::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_verify_trees>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6a00fa0>
from_instance = <lib389.DirSrv object at 0x7fa5f98d9130>
to_instance = <lib389.DirSrv object at 0x7fa5f67c4a00>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/fourwaymmr/fourwaymmr_test.py::test_create_an_entry_on_the_supplier::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_verify_trees>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6a00fa0>
from_instance = <lib389.DirSrv object at 0x7fa5f98d9130>
to_instance = <lib389.DirSrv object at 0x7fa5f67c4a00>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/fourwaymmr/fourwaymmr_test.py::test_bob_acceptance_tests::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_verify_trees>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6a00fa0>
from_instance = <lib389.DirSrv object at 0x7fa5f98d9130>
to_instance = <lib389.DirSrv object at 0x7fa5f67c4a00>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/fourwaymmr/fourwaymmr_test.py::test_replica_backup_and_restore::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_verify_trees>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6a00fa0>
from_instance = <lib389.DirSrv object at 0x7fa5f98d9130>
to_instance = <lib389.DirSrv object at 0x7fa5f67c4a00>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
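The eight topo_m4 entries above are not eight independent failures. The fixture is module-scoped, so pytest attempts the expensive topology setup once, caches the exception, and re-raises it for every later test in the module that requests the fixture; that is why these entries all show a 0.00 duration. A small self-contained illustration of that pytest behaviour (hypothetical names, unrelated to lib389):

import pytest

@pytest.fixture(scope="module")
def broken_topology():
    # Raises once; pytest remembers the failure for the rest of the module.
    raise Exception("Replication did not sync in time!")

def test_first(broken_topology):
    pass  # reported as an error, charged with the real setup time

def test_second(broken_topology):
    pass  # also an error, but with a 0.00 setup duration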
Error suites/fractional/fractional_test.py::test_fractional_agreements::setup 48.40
request = <SubRequest 'topology_m2c2' for <Function test_fractional_agreements>>

@pytest.fixture(scope="module")
def topology_m2c2(request):
"""Create Replication Deployment with two masters and two consumers"""

> topology = create_topology({ReplicaRole.MASTER: 2,
ReplicaRole.CONSUMER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f67661f0>
from_instance = <lib389.DirSrv object at 0x7fa5f6a9da90>
to_instance = <lib389.DirSrv object at 0x7fa5f722bb50>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39202, 'ldap-secureport': 63902, 'server-id': 'consumer2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 37b57cc8-bbba-463f-afb0-569a6ee3dcf5 / got description=None)
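got description=None on every retry means the consumer never received the marker modify at all; since this is the very first synchronisation after the agreement was created, the attribute has no earlier value to fall back on. A quick way to tell a slow initial sync from a dead agreement is to rerun the fixture's own check with a more generous timeout. A sketch under the same assumptions as the probe above (two connected DirSrv objects m1 and m2 on the default suffix):

from lib389._constants import DEFAULT_SUFFIX
from lib389.replica import ReplicationManager

repl = ReplicationManager(DEFAULT_SUFFIX)
# The same probe the topology fixture runs, but with 120 one-second
# retries instead of the default 20; if this still raises, the
# agreement is dead rather than merely slow.
repl.wait_for_replication(m1, m2, timeout=120)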
Error suites/fractional/fractional_test.py::test_read_only_consumer::setup 0.00
request = <SubRequest 'topology_m2c2' for <Function test_fractional_agreements>>

@pytest.fixture(scope="module")
def topology_m2c2(request):
"""Create Replication Deployment with two masters and two consumers"""

> topology = create_topology({ReplicaRole.MASTER: 2,
ReplicaRole.CONSUMER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f67661f0>
from_instance = <lib389.DirSrv object at 0x7fa5f6a9da90>
to_instance = <lib389.DirSrv object at 0x7fa5f722bb50>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/fractional/fractional_test.py::test_read_write_supplier::setup 0.00
request = <SubRequest 'topology_m2c2' for <Function test_fractional_agreements>>

@pytest.fixture(scope="module")
def topology_m2c2(request):
"""Create Replication Deployment with two masters and two consumers"""

> topology = create_topology({ReplicaRole.MASTER: 2,
ReplicaRole.CONSUMER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f67661f0>
from_instance = <lib389.DirSrv object at 0x7fa5f6a9da90>
to_instance = <lib389.DirSrv object at 0x7fa5f722bb50>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/fractional/fractional_test.py::test_filtered_attributes::setup 0.00
request = <SubRequest 'topology_m2c2' for <Function test_fractional_agreements>>

@pytest.fixture(scope="module")
def topology_m2c2(request):
"""Create Replication Deployment with two masters and two consumers"""

> topology = create_topology({ReplicaRole.MASTER: 2,
ReplicaRole.CONSUMER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f67661f0>
from_instance = <lib389.DirSrv object at 0x7fa5f6a9da90>
to_instance = <lib389.DirSrv object at 0x7fa5f722bb50>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/fractional/fractional_test.py::test_fewer_changes_in_single_operation::setup 0.00
request = <SubRequest 'topology_m2c2' for <Function test_fractional_agreements>>

@pytest.fixture(scope="module")
def topology_m2c2(request):
"""Create Replication Deployment with two masters and two consumers"""

> topology = create_topology({ReplicaRole.MASTER: 2,
ReplicaRole.CONSUMER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f67661f0>
from_instance = <lib389.DirSrv object at 0x7fa5f6a9da90>
to_instance = <lib389.DirSrv object at 0x7fa5f722bb50>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/fractional/fractional_test.py::test_newly_added_attribute_nsds5replicatedattributelisttotal::setup 0.00
request = <SubRequest 'topology_m2c2' for <Function test_fractional_agreements>>

@pytest.fixture(scope="module")
def topology_m2c2(request):
"""Create Replication Deployment with two masters and two consumers"""

> topology = create_topology({ReplicaRole.MASTER: 2,
ReplicaRole.CONSUMER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f67661f0>
from_instance = <lib389.DirSrv object at 0x7fa5f6a9da90>
to_instance = <lib389.DirSrv object at 0x7fa5f722bb50>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/fractional/fractional_test.py::test_attribute_nsds5replicatedattributelisttotal::setup 0.00
request = <SubRequest 'topology_m2c2' for <Function test_fractional_agreements>>

@pytest.fixture(scope="module")
def topology_m2c2(request):
"""Create Replication Deployment with two masters and two consumers"""

> topology = create_topology({ReplicaRole.MASTER: 2,
ReplicaRole.CONSUMER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f67661f0>
from_instance = <lib389.DirSrv object at 0x7fa5f6a9da90>
to_instance = <lib389.DirSrv object at 0x7fa5f722bb50>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/fractional/fractional_test.py::test_implicit_replication_of_password_policy::setup 0.00
request = <SubRequest 'topology_m2c2' for <Function test_fractional_agreements>>

@pytest.fixture(scope="module")
def topology_m2c2(request):
"""Create Replication Deployment with two masters and two consumers"""

> topology = create_topology({ReplicaRole.MASTER: 2,
ReplicaRole.CONSUMER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f67661f0>
from_instance = <lib389.DirSrv object at 0x7fa5f6a9da90>
to_instance = <lib389.DirSrv object at 0x7fa5f722bb50>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/gssapi_repl/gssapi_repl_test.py::test_gssapi_repl::setup 39.36
request = <SubRequest 'topology_m2' for <Function test_gssapi_repl>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f6739820>
from_instance = <lib389.DirSrv object at 0x7fa5f6710d00>
to_instance = <lib389.DirSrv object at 0x7fa5f670d2b0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 101b3104-f597-4253-9f57-6965f47a0bc4 / got description=None)
Error suites/lib389/idm/user_compare_m2Repl_test.py::test_user_compare_m2Repl::setup 39.82
request = <SubRequest 'topology_m2' for <Function test_user_compare_m2Repl>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f649aee0>
from_instance = <lib389.DirSrv object at 0x7fa5f654bd30>
to_instance = <lib389.DirSrv object at 0x7fa5f655a220>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from_instance.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 9242dfb5-a408-4f56-af0a-e2a430286bd7 / got description=None) [logged 20 times, once per second, until the 20 s timeout expired]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 9242dfb5-a408-4f56-af0a-e2a430286bd7 / got description=None)
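The probe can also be reproduced step by step with the same Groups API the traceback shows, which helps distinguish replication that is merely slow from replication that is fully stalled. A hedged sketch, assuming connected DirSrv handles m1/m2 and SUFFIX = 'dc=example,dc=com' (handle and variable names are illustrative):

import time
import uuid
from lib389.idm.group import Groups  # same class used in wait_for_replication

change = str(uuid.uuid4())
# Touch the source side, exactly as the fixture does ...
Groups(m1, basedn=SUFFIX, rdn=None).get('replication_managers').replace('description', change)
# ... then poll the target side well past the fixture's 20 s limit.
to_group = Groups(m2, basedn=SUFFIX, rdn=None).get('replication_managers')
for _ in range(300):
    if to_group.get_attr_val_utf8('description') == change:
        print('replication caught up (slow, not broken)')
        break
    time.sleep(1)
else:
    print('no sync after 300 s (genuinely stalled)')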
Error suites/mapping_tree/referral_during_tot_init_test.py::test_referral_during_tot::setup 37.76
request = <SubRequest 'topology_m2' for <Function test_referral_during_tot>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f66e6670>
from_instance = <lib389.DirSrv object at 0x7fa5f66f54c0>
to_instance = <lib389.DirSrv object at 0x7fa5f654cc40>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from_instance.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 8151fd0c-550c-474b-8fe6-1a26e6700d05 / got description=None) [logged 20 times, once per second, until the 20 s timeout expired]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 8151fd0c-550c-474b-8fe6-1a26e6700d05 / got description=None)
Error suites/memberof_plugin/regression_test.py::test_memberof_with_repl::setup 44.99
request = <SubRequest 'topo' for <Function test_memberof_with_repl>>

@pytest.fixture(scope="module")
def topology_m1h1c1(request):
"""Create Replication Deployment with one master, one consumer and one hub"""

topo_roles = {ReplicaRole.MASTER: 1, ReplicaRole.HUB: 1, ReplicaRole.CONSUMER: 1}
topology = _create_instances(topo_roles, DEFAULT_SUFFIX)
master = topology.ms["master1"]
hub = topology.hs["hub1"]
consumer = topology.cs["consumer1"]

# Start with the master, and create it "first".
log.info("Creating replication topology.")
# Now get the first master ready.
repl = ReplicationManager(DEFAULT_SUFFIX)
repl.create_first_master(master)
# Finish the topology creation
> repl.join_hub(master, hub)

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:520:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2226: in join_hub
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f65bb610>
from_instance = <lib389.DirSrv object at 0x7fa5f66e10a0>
to_instance = <lib389.DirSrv object at 0x7fa5f66e1100>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from_instance.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-consumer1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39101, 'ldap-secureport': 63801, 'server-id': 'hub1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:515 Creating replication topology.
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 043fc90a-8c3a-43a4-9b3c-b7bb970bdd4c / got description=None) [logged 20 times, once per second, until the 20 s timeout expired]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 043fc90a-8c3a-43a4-9b3c-b7bb970bdd4c / got description=None)
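This entry differs from the others only in topology: join_hub() on the master-to-hub link runs the same test_replication()/wait_for_replication() check internally (replica.py:2226 -> 2534 in the traceback), so the 20 s limit cannot be raised through join_hub() itself. A hedged sketch of one possible workaround, reusing create_first_master/join_hub from the fixture above; the try/except re-check is the illustrative addition, and master/hub are the fixture's instance handles:

from lib389._constants import DEFAULT_SUFFIX
from lib389.replica import ReplicationManager

repl = ReplicationManager(DEFAULT_SUFFIX)
repl.create_first_master(master)
try:
    repl.join_hub(master, hub)  # internal wait is capped at 20 s
except Exception:
    # The agreement already exists at this point (see the SUCCESS line in
    # the log above), so a longer re-check may still catch up on a slow VM.
    repl.wait_for_replication(master, hub, timeout=120)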
Error suites/memberof_plugin/regression_test.py::test_scheme_violation_errors_logged::setup 39.83
request = <SubRequest 'topo_m2' for <Function test_scheme_violation_errors_logged>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f66d8040>
from_instance = <lib389.DirSrv object at 0x7fa5f66f0070>
to_instance = <lib389.DirSrv object at 0x7fa5f66e42b0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from_instance.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 2d0c577c-5a21-4fae-a436-ba06f06f719a / got description=None) [logged 20 times, once per second, until the 20 s timeout expired]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 2d0c577c-5a21-4fae-a436-ba06f06f719a / got description=None)
Error suites/memberof_plugin/regression_test.py::test_memberof_with_changelog_reset::setup 0.00
request = <SubRequest 'topo_m2' for <Function test_scheme_violation_errors_logged>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f66d8040>
from_instance = <lib389.DirSrv object at 0x7fa5f66f0070>
to_instance = <lib389.DirSrv object at 0x7fa5f66e42b0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from_instance.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
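This entry reports 0.00 s and a request bound to the previous test (SubRequest 'topo_m2' for test_scheme_violation_errors_logged) because pytest runs a module-scoped fixture once, caches the outcome, and re-raises a cached setup exception for every later test in the module without re-running the fixture. A minimal self-contained illustration of that behavior (a hypothetical module, not from this suite):

import pytest

@pytest.fixture(scope="module")
def topo_m2():
    raise Exception("Replication did not sync in time!")  # executes once

def test_first(topo_m2):   # pays the full setup time, then errors
    pass

def test_second(topo_m2):  # errors instantly with the cached exception (0.00 s)
    pass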
Error suites/plugins/entryusn_test.py::test_entryusn_after_repl_delete::setup 38.10
request = <SubRequest 'topology_m2' for <Function test_entryusn_after_repl_delete>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f57bd520>
from_instance = <lib389.DirSrv object at 0x7fa5f57dcc70>
to_instance = <lib389.DirSrv object at 0x7fa5f5abf640>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from_instance.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a08fd5b2-eb25-431a-b320-8c1a65ae0fe2 / got description=None) [logged 20 times, once per second, until the 20 s timeout expired]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a08fd5b2-eb25-431a-b320-8c1a65ae0fe2 / got description=None)
Error suites/referint_plugin/rename_test.py::test_rename_large_subtree::setup 39.90
request = <SubRequest 'topology_m2' for <Function test_rename_large_subtree>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f58d1e80>
from_instance = <lib389.DirSrv object at 0x7fa5f90cf1c0>
to_instance = <lib389.DirSrv object at 0x7fa5f58bde20>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from_instance.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 56d50053-1d0e-4914-85a1-2854bba147ce / got description=None)
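The check that fails above uses a simple write-marker-then-poll pattern: write a fresh UUID to an attribute on the supplier, then poll the consumer once per second until the same value appears or the timeout expires. A minimal stdlib-only sketch of that pattern, where write_marker and read_marker are hypothetical callables standing in for the LDAP replace and read of the description attribute:

import time
import uuid

def wait_for_sync(write_marker, read_marker, timeout=20):
    """Write a unique marker via write_marker, then poll read_marker
    once per second until the marker appears or timeout expires.

    write_marker and read_marker are hypothetical callables standing
    in for the LDAP operations used by lib389's wait_for_replication.
    """
    change = str(uuid.uuid4())
    write_marker(change)
    for _ in range(timeout):
        if read_marker() == change:
            return True
        time.sleep(1)
    raise Exception("Replication did not sync in time!")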
Error suites/replication/acceptance_test.py::test_add_entry::setup 52.53
request = <SubRequest 'topo_m4' for <Function test_add_entry>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f58c1550>
from_instance = <lib389.DirSrv object at 0x7fa5f584e730>
to_instance = <lib389.DirSrv object at 0x7fa5f575c160>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed. Instance slapd-master3 removed. Instance slapd-master4 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39004, 'ldap-secureport': 63704, 'server-id': 'master4', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91a7af99-7015-4e43-97eb-203f3b63660f / got description=None)
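Because topology_m4 is module-scoped, the failure above is the only real setup run for this module. When diagnosing, the failing check can be re-run by hand with a longer timeout; a sketch, assuming m1 and m2 are two running DirSrv instances bound as in the fixture, using the lib389 names that appear in the tracebacks above:

from lib389._constants import DEFAULT_SUFFIX
from lib389.replica import ReplicationManager

repl = ReplicationManager(DEFAULT_SUFFIX)
# Same check the fixture runs via test_replication, but with a longer
# timeout than the default 20 s, to separate a merely slow agreement
# from a genuinely broken one (m1 and m2 are assumed live instances).
repl.wait_for_replication(m1, m2, timeout=120)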
Error suites/replication/acceptance_test.py::test_modify_entry::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_add_entry>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f58c1550>
from_instance = <lib389.DirSrv object at 0x7fa5f584e730>
to_instance = <lib389.DirSrv object at 0x7fa5f575c160>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
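Note the 0.00 durations and the unchanged <SubRequest 'topo_m4' for <Function test_add_entry>> in this and the following entries: a module-scoped pytest fixture that raises during setup is not re-executed, so the cached error from the first test's setup is replayed for every later test that requests it. A hypothetical minimal reproduction of that behaviour:

import pytest

@pytest.fixture(scope="module")
def broken_topology():
    # Raised once, during the first test's setup; pytest caches the
    # failure and replays it for every later test in the module.
    raise Exception("Replication did not sync in time!")

def test_first(broken_topology):
    pass  # ERROR at setup, pays the real setup cost

def test_second(broken_topology):
    pass  # ERROR at setup in ~0.00 s with the same cached traceback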
Error suites/replication/acceptance_test.py::test_delete_entry::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_add_entry>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f58c1550>
from_instance = <lib389.DirSrv object at 0x7fa5f584e730>
to_instance = <lib389.DirSrv object at 0x7fa5f575c160>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/acceptance_test.py::test_modrdn_entry[0]::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_add_entry>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f58c1550>
from_instance = <lib389.DirSrv object at 0x7fa5f584e730>
to_instance = <lib389.DirSrv object at 0x7fa5f575c160>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/acceptance_test.py::test_modrdn_entry[1]::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_add_entry>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f58c1550>
from_instance = <lib389.DirSrv object at 0x7fa5f584e730>
to_instance = <lib389.DirSrv object at 0x7fa5f575c160>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/acceptance_test.py::test_modrdn_after_pause::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_add_entry>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f58c1550>
from_instance = <lib389.DirSrv object at 0x7fa5f584e730>
to_instance = <lib389.DirSrv object at 0x7fa5f575c160>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/acceptance_test.py::test_modify_stripattrs::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_add_entry>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f58c1550>
from_instance = <lib389.DirSrv object at 0x7fa5f584e730>
to_instance = <lib389.DirSrv object at 0x7fa5f575c160>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/acceptance_test.py::test_new_suffix::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_add_entry>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f58c1550>
from_instance = <lib389.DirSrv object at 0x7fa5f584e730>
to_instance = <lib389.DirSrv object at 0x7fa5f575c160>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/acceptance_test.py::test_many_attrs::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_add_entry>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f58c1550>
from_instance = <lib389.DirSrv object at 0x7fa5f584e730>
to_instance = <lib389.DirSrv object at 0x7fa5f575c160>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/acceptance_test.py::test_double_delete::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_add_entry>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f58c1550>
from_instance = <lib389.DirSrv object at 0x7fa5f584e730>
to_instance = <lib389.DirSrv object at 0x7fa5f575c160>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/acceptance_test.py::test_password_repl_error::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_add_entry>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f58c1550>
from_instance = <lib389.DirSrv object at 0x7fa5f584e730>
to_instance = <lib389.DirSrv object at 0x7fa5f575c160>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/acceptance_test.py::test_invalid_agmt::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_add_entry>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f58c1550>
from_instance = <lib389.DirSrv object at 0x7fa5f584e730>
to_instance = <lib389.DirSrv object at 0x7fa5f575c160>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/acceptance_test.py::test_warining_for_invalid_replica::setup 0.00
request = <SubRequest 'topo_m4' for <Function test_add_entry>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f58c1550>
from_instance = <lib389.DirSrv object at 0x7fa5f584e730>
to_instance = <lib389.DirSrv object at 0x7fa5f575c160>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/acceptance_test.py::test_csnpurge_large_valueset::setup 39.52
request = <SubRequest 'topo_m2' for <Function test_csnpurge_large_valueset>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f5884d00>
from_instance = <lib389.DirSrv object at 0x7fa5f587d430>
to_instance = <lib389.DirSrv object at 0x7fa5f58a94c0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bf313809-90d1-48b7-b2c0-a5e6eca71d41 / got description=None)
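Captured-log runs like the one above repeat the same retry record once per second for the full 20 s timeout. When reading many such entries, a short stdlib helper can condense each run into a single count; a sketch, where log_text is assumed to hold the captured-log text of one report entry:

import re
from collections import Counter

def summarize_retries(log_text):
    """Count Retry records per supplier -> consumer pair so a
    20-line retry run reads as a single line."""
    pairs = re.findall(
        r"Retry: Replication from (\S+) to (\S+) is NOT working",
        log_text)
    return Counter(pairs)

# e.g. Counter({('ldap://...:39001', 'ldap://...:39002'): 20})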
Error suites/replication/acceptance_test.py::test_urp_trigger_substring_search::setup 0.00
request = <SubRequest 'topo_m2' for <Function test_csnpurge_large_valueset>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f5884d00>
from_instance = <lib389.DirSrv object at 0x7fa5f587d430>
to_instance = <lib389.DirSrv object at 0x7fa5f58a94c0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
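Every replication-timeout error in this report shares the mechanism shown above: wait_for_replication() writes a fresh UUID into the description attribute of the replication_managers group on the supplier, then polls the consumer once per second for 20 seconds; here description stays None throughout, so the probe write never arrives. The following is a minimal diagnostic sketch, not part of the test run: it assumes two already-running masters m1 and m2 (lib389.DirSrv handles, e.g. from a topology fixture), and the 60-second timeout is illustrative.

from lib389.replica import ReplicationManager
from lib389._constants import DEFAULT_SUFFIX

repl = ReplicationManager(DEFAULT_SUFFIX)
# Same probe the failing fixtures run (replica.py:2518), but with a longer
# window than the 20s default, to separate "broken" from "merely slow".
repl.wait_for_replication(m1, m2, timeout=60)  # m1, m2: assumed DirSrv handles
repl.wait_for_replication(m2, m1, timeout=60)  # check the reverse direction too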
Error suites/replication/cascading_test.py::test_basic_with_hub::setup 46.04
request = <SubRequest 'topo' for <Function test_basic_with_hub>>

@pytest.fixture(scope="module")
def topology_m1h1c1(request):
"""Create Replication Deployment with one master, one consumer and one hub"""

topo_roles = {ReplicaRole.MASTER: 1, ReplicaRole.HUB: 1, ReplicaRole.CONSUMER: 1}
topology = _create_instances(topo_roles, DEFAULT_SUFFIX)
master = topology.ms["master1"]
hub = topology.hs["hub1"]
consumer = topology.cs["consumer1"]

# Start with the master, and create it "first".
log.info("Creating replication topology.")
# Now get the first master ready.
repl = ReplicationManager(DEFAULT_SUFFIX)
repl.create_first_master(master)
# Finish the topology creation
> repl.join_hub(master, hub)

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:520:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2226: in join_hub
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f56f6100>
from_instance = <lib389.DirSrv object at 0x7fa5f56e9670>
to_instance = <lib389.DirSrv object at 0x7fa5f56e96d0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
Instance slapd-hub1 removed.
Instance slapd-consumer1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39101, 'ldap-secureport': 63801, 'server-id': 'hub1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:515 Creating replication topology.
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 65418c46-15d5-4984-bf46-64cd6abdac6a / got description=None)
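For orientation, the cascade that topology_m1h1c1 is building never gets past its first hop. A sketch of the intended wiring, using only the ReplicationManager calls visible in the tracebacks above (the master, hub and consumer handles are assumed to exist already):

from lib389.replica import ReplicationManager
from lib389._constants import DEFAULT_SUFFIX

repl = ReplicationManager(DEFAULT_SUFFIX)
repl.create_first_master(master)
repl.join_hub(master, hub)           # raises here: the hub never sees the probe write
repl.join_consumer(hub, consumer)    # never reached in this run
repl.test_replication(master, hub)
repl.test_replication(hub, consumer)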
Error suites/replication/changelog_encryption_test.py::test_cl_encryption_setup_process::setup 39.87
request = <SubRequest 'topo' for <Function test_cl_encryption_setup_process>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f56b39d0>
from_instance = <lib389.DirSrv object at 0x7fa5f59beb50>
to_instance = <lib389.DirSrv object at 0x7fa5f59bebb0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
Instance slapd-consumer1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:175 Joining consumer consumer1 from master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9c9ac1c3-694c-482e-837c-2aca73fe1a0a / got description=None)
Error suites/replication/changelog_test.py::test_dsconf_dump_changelog_files_removed::setup 41.24
request = <SubRequest 'topo' for <Function test_dsconf_dump_changelog_files_removed>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f59cae50>
from_instance = <lib389.DirSrv object at 0x7fa5f5997e20>
to_instance = <lib389.DirSrv object at 0x7fa5f568c6a0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6526dab2-30f5-4424-860f-65473bddcdb0 / got description=None)
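The changelog_test entries that follow all report 0.00 seconds and repeat this exact traceback. That is expected pytest behaviour: topology_m2 is a module-scoped fixture, so it is evaluated once, its failure is cached, and every remaining test in the module errors instantly with the cached result (which is also why each SubRequest still names test_dsconf_dump_changelog_files_removed). A standalone illustration of that caching behaviour, not taken from the suite:

import pytest

@pytest.fixture(scope="module")
def broken_topology():
    raise Exception("Replication did not sync in time!")

def test_first(broken_topology):   # pays the setup cost and records the error
    pass

def test_second(broken_topology):  # errors in ~0.00s with the cached traceback
    pass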
Error suites/replication/changelog_test.py::test_verify_changelog::setup 0.00
request = <SubRequest 'topo' for <Function test_dsconf_dump_changelog_files_removed>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f59cae50>
from_instance = <lib389.DirSrv object at 0x7fa5f5997e20>
to_instance = <lib389.DirSrv object at 0x7fa5f568c6a0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/changelog_test.py::test_verify_changelog_online_backup::setup 0.00
request = <SubRequest 'topo' for <Function test_dsconf_dump_changelog_files_removed>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f59cae50>
from_instance = <lib389.DirSrv object at 0x7fa5f5997e20>
to_instance = <lib389.DirSrv object at 0x7fa5f568c6a0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/changelog_test.py::test_verify_changelog_offline_backup::setup 0.00
request = <SubRequest 'topo' for <Function test_dsconf_dump_changelog_files_removed>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f59cae50>
from_instance = <lib389.DirSrv object at 0x7fa5f5997e20>
to_instance = <lib389.DirSrv object at 0x7fa5f568c6a0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/changelog_test.py::test_changelog_maxage::setup 0.00
request = <SubRequest 'topo' for <Function test_dsconf_dump_changelog_files_removed>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f59cae50>
from_instance = <lib389.DirSrv object at 0x7fa5f5997e20>
to_instance = <lib389.DirSrv object at 0x7fa5f568c6a0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/changelog_test.py::test_ticket47669_changelog_triminterval::setup 0.00
request = <SubRequest 'topo' for <Function test_dsconf_dump_changelog_files_removed>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f59cae50>
from_instance = <lib389.DirSrv object at 0x7fa5f5997e20>
to_instance = <lib389.DirSrv object at 0x7fa5f568c6a0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/changelog_test.py::test_changelog_compactdbinterval::setup 0.00
request = <SubRequest 'topo' for <Function test_dsconf_dump_changelog_files_removed>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f59cae50>
from_instance = <lib389.DirSrv object at 0x7fa5f5997e20>
to_instance = <lib389.DirSrv object at 0x7fa5f568c6a0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/changelog_test.py::test_retrochangelog_maxage::setup 0.00
request = <SubRequest 'topo' for <Function test_dsconf_dump_changelog_files_removed>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f59cae50>
from_instance = <lib389.DirSrv object at 0x7fa5f5997e20>
to_instance = <lib389.DirSrv object at 0x7fa5f568c6a0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/changelog_test.py::test_retrochangelog_trimming_crash::setup 0.00
request = <SubRequest 'topo' for <Function test_dsconf_dump_changelog_files_removed>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f59cae50>
from_instance = <lib389.DirSrv object at 0x7fa5f5997e20>
to_instance = <lib389.DirSrv object at 0x7fa5f568c6a0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/changelog_trimming_test.py::test_max_age::setup 7.51
topo = <lib389.topologies.TopologyMain object at 0x7fa5f5670370>
request = <SubRequest 'setup_max_age' for <Function test_max_age>>

@pytest.fixture(scope="module")
def setup_max_age(topo, request):
"""Configure logging and changelog max age
"""
master = topo.ms["master1"]
master.config.loglevel((ErrorLog.REPLICA,), 'error')

if ds_supports_new_changelog():
set_value(master, MAXAGE, '5')
set_value(master, TRIMINTERVAL, '300')
else:
cl = Changelog5(master)
> cl.set_max_age('5')

suites/replication/changelog_trimming_test.py:71:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/replica.py:1190: in set_max_age
self.replace('nsslapd-changelogmaxage', value)
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:280: in replace
self.set(key, value, action=ldap.MOD_REPLACE)
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:446: in set
return self._instance.modify_ext_s(self._dn, [(action, key, value)],
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:613: in modify_ext_s
resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7fa5f5670310>
func = <built-in method result4 of LDAP object at 0x7fa5f5994fc0>
args = (16, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.NO_SUCH_OBJECT: {'msgtype': 103, 'msgid': 16, 'result': 32, 'desc': 'No such object', 'ctrls': []}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: NO_SUCH_OBJECT
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
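Unlike the replication timeouts above, this error is an LDAP NO_SUCH_OBJECT (result 32): the entry that Changelog5 modifies, cn=changelog5,cn=config, does not exist when set_max_age() runs. One plausible fix, sketched under the assumption that the legacy changelog entry simply needs to be created first; the changelog directory path below is an assumed default for slapd-master1, not taken from this run:

from lib389.replica import Changelog5

cl = Changelog5(master)  # master: assumed DirSrv handle for slapd-master1
if not cl.exists():
    cl.create(properties={
        'cn': 'changelog5',
        # Assumed default changelog location; adjust to the instance's layout.
        'nsslapd-changelogdir': '/var/lib/dirsrv/slapd-master1/changelogdb',
    })
cl.set_max_age('5')

The same missing entry would explain the set_trim_interval() failure in the next item.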
Error suites/replication/changelog_trimming_test.py::test_max_entries::setup 0.02
topo = <lib389.topologies.TopologyMain object at 0x7fa5f5670370>
request = <SubRequest 'setup_max_entries' for <Function test_max_entries>>

@pytest.fixture(scope="module")
def setup_max_entries(topo, request):
"""Configure logging and changelog max entries
"""
master = topo.ms["master1"]

master.config.loglevel((ErrorLog.REPLICA,), 'error')

if ds_supports_new_changelog():
set_value(master, MAXENTRIES, '2')
set_value(master, TRIMINTERVAL, '300')
else:
cl = Changelog5(master)
> cl.set_trim_interval('300')

suites/replication/changelog_trimming_test.py:57:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/replica.py:1181: in set_trim_interval
self.replace('nsslapd-changelogtrim-interval', value)
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:280: in replace
self.set(key, value, action=ldap.MOD_REPLACE)
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:446: in set
return self._instance.modify_ext_s(self._dn, [(action, key, value)],
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:613: in modify_ext_s
resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7fa5f5670310>
func = <built-in method result4 of LDAP object at 0x7fa5f5994fc0>
args = (18, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
    ...  # source identical to the _ldap_call listing shown above
>           result = func(*args,**kwargs)
E           ldap.NO_SUCH_OBJECT: {'msgtype': 103, 'msgid': 18, 'result': 32, 'desc': 'No such object', 'ctrls': []}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: NO_SUCH_OBJECT
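Note: cl.set_trim_interval() writes nsslapd-changelogtrim-interval on cn=changelog5,cn=config, and that entry does not exist when replication was never configured, hence the NO_SUCH_OBJECT. A minimal sketch, assuming `master` is a lib389 DirSrv handle, of guarding the call so the root cause is reported directly (the wrapper name is hypothetical):

import ldap
from lib389.replica import Changelog5

def set_trim_interval_safely(master, interval='300'):
    # hypothetical guard around the call that failed in the fixture above
    cl = Changelog5(master)
    try:
        cl.set_trim_interval(interval)
    except ldap.NO_SUCH_OBJECT:
        raise RuntimeError("cn=changelog5,cn=config is missing: replication "
                           "was never configured on this instance") from None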
Error suites/replication/cleanallruv_max_tasks_test.py::test_max_tasks::setup 52.21
request = <SubRequest 'topology_m4' for <Function test_max_tasks>>

@pytest.fixture(scope="module")
def topology_m4(request):
    """Create Replication Deployment with four masters"""

>   topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f5824310>
from_instance = <lib389.DirSrv object at 0x7fa5f4fa7d30>
to_instance = <lib389.DirSrv object at 0x7fa5f598b9a0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    """Wait for a replication event to occur from instance to instance. This
    shows some point of synchronisation has occurred.

    :param from_instance: The instance whose state we want to check from.
    :type from_instance: lib389.DirSrv
    :param to_instance: The instance whose state we want to check matches from.
    :type to_instance: lib389.DirSrv
    :param timeout: Fail after timeout seconds.
    :type timeout: int

    """
    # Touch something then wait_for_replication.
    from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
    to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
    from_group = from_groups.get('replication_managers')
    to_group = to_groups.get('replication_managers')

    change = str(uuid.uuid4())

    from_group.replace('description', change)

    for i in range(0, timeout):
        desc = to_group.get_attr_val_utf8('description')
        if change == desc:
            self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
            return True
        self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
        time.sleep(1)
    self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master2 removed. Instance slapd-master3 removed. Instance slapd-master4 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39004, 'ldap-secureport': 63704, 'server-id': 'master4', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect db45a101-b7dc-4867-892d-88af5dc01ae6 / got description=None)  [message repeated for all 20 one-second retries]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect db45a101-b7dc-4867-892d-88af5dc01ae6 / got description=None)
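Note: every retry above read description=None, i.e. the consumer never received the supplier's write inside the hard-coded 20 s window. A sketch, assuming two connected DirSrv handles m1 and m2, of re-running the same lib389 check by hand with more headroom (the function name is hypothetical):

from lib389._constants import DEFAULT_SUFFIX
from lib389.replica import ReplicationManager

def check_sync(m1, m2):
    # Same write-then-poll probe the fixture runs, but with a 120 s budget
    # instead of the default timeout=20 that expired above.
    repl = ReplicationManager(DEFAULT_SUFFIX)
    return repl.wait_for_replication(m1, m2, timeout=120)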
Error suites/replication/cleanallruv_test.py::test_clean::setup 54.62
request = <SubRequest 'topology_m4' for <Function test_clean>>

@pytest.fixture(scope="module")
def topology_m4(request):
    """Create Replication Deployment with four masters"""

>   topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f5626e50>
from_instance = <lib389.DirSrv object at 0x7fa5f55acb20>
to_instance = <lib389.DirSrv object at 0x7fa5f559e8e0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    ...  # source and docstring identical to the wait_for_replication listing shown above
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed. Instance slapd-master3 removed. Instance slapd-master4 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39004, 'ldap-secureport': 63704, 'server-id': 'master4', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect dd945db0-bd74-4070-90a6-1db7ed20c738 / got description=None)  [message repeated for all 20 one-second retries]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect dd945db0-bd74-4070-90a6-1db7ed20c738 / got description=None)
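Note: all of the topology_m4 setup errors in this suite come from one module-scoped fixture built around a single create_topology call. A sketch of that pattern; the finalizer shown is an assumption for illustration, not the topologies.py source:

import pytest
from lib389._constants import ReplicaRole
from lib389.topologies import create_topology

@pytest.fixture(scope="module")
def topology_m4(request):
    """Create Replication Deployment with four masters"""
    topology = create_topology({ReplicaRole.MASTER: 4})

    def fin():
        # assumed cleanup: drop each master instance at module teardown
        for inst in topology.ms.values():
            inst.delete()
    request.addfinalizer(fin)

    return topology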
Error suites/replication/cleanallruv_test.py::test_clean_restart::setup 0.00
request = <SubRequest 'topology_m4' for <Function test_clean>>

@pytest.fixture(scope="module")
def topology_m4(request):
    """Create Replication Deployment with four masters"""

>   topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f5626e50>
from_instance = <lib389.DirSrv object at 0x7fa5f55acb20>
to_instance = <lib389.DirSrv object at 0x7fa5f559e8e0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    ...  # source and docstring identical to the wait_for_replication listing shown above
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
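Note: test_clean_restart and the cleanallruv entries after it finish in 0.00 s and still name the SubRequest for test_clean: once a module-scoped fixture raises, pytest does not re-run it, it caches the failure and re-raises it for every later test in the module. A toy example, independent of lib389, showing the same behaviour:

import pytest

@pytest.fixture(scope="module")
def broken_topology():
    raise Exception("setup failed once")

def test_first(broken_topology):
    pass   # errors while the fixture body runs

def test_second(broken_topology):
    pass   # errors in 0.00 s: pytest replays the cached fixture failure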
Error suites/replication/cleanallruv_test.py::test_clean_force::setup 0.00
request = <SubRequest 'topology_m4' for <Function test_clean>>

@pytest.fixture(scope="module")
def topology_m4(request):
    """Create Replication Deployment with four masters"""

>   topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f5626e50>
from_instance = <lib389.DirSrv object at 0x7fa5f55acb20>
to_instance = <lib389.DirSrv object at 0x7fa5f559e8e0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    ...  # source and docstring identical to the wait_for_replication listing shown above
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/cleanallruv_test.py::test_abort::setup 0.00
request = <SubRequest 'topology_m4' for <Function test_clean>>

@pytest.fixture(scope="module")
def topology_m4(request):
    """Create Replication Deployment with four masters"""

>   topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f5626e50>
from_instance = <lib389.DirSrv object at 0x7fa5f55acb20>
to_instance = <lib389.DirSrv object at 0x7fa5f559e8e0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    ...  # source and docstring identical to the wait_for_replication listing shown above
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/cleanallruv_test.py::test_abort_restart::setup 0.00
request = <SubRequest 'topology_m4' for <Function test_clean>>

@pytest.fixture(scope="module")
def topology_m4(request):
    """Create Replication Deployment with four masters"""

>   topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f5626e50>
from_instance = <lib389.DirSrv object at 0x7fa5f55acb20>
to_instance = <lib389.DirSrv object at 0x7fa5f559e8e0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    ...  # source and docstring identical to the wait_for_replication listing shown above
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/cleanallruv_test.py::test_abort_certify::setup 0.00
request = <SubRequest 'topology_m4' for <Function test_clean>>

@pytest.fixture(scope="module")
def topology_m4(request):
    """Create Replication Deployment with four masters"""

>   topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f5626e50>
from_instance = <lib389.DirSrv object at 0x7fa5f55acb20>
to_instance = <lib389.DirSrv object at 0x7fa5f559e8e0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    ...  # source and docstring identical to the wait_for_replication listing shown above
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/cleanallruv_test.py::test_stress_clean::setup 0.00
request = <SubRequest 'topology_m4' for <Function test_clean>>

@pytest.fixture(scope="module")
def topology_m4(request):
    """Create Replication Deployment with four masters"""

>   topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f5626e50>
from_instance = <lib389.DirSrv object at 0x7fa5f55acb20>
to_instance = <lib389.DirSrv object at 0x7fa5f559e8e0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    ...  # source and docstring identical to the wait_for_replication listing shown above
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/cleanallruv_test.py::test_multiple_tasks_with_force::setup 0.00
request = <SubRequest 'topology_m4' for <Function test_clean>>

@pytest.fixture(scope="module")
def topology_m4(request):
    """Create Replication Deployment with four masters"""

>   topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f5626e50>
from_instance = <lib389.DirSrv object at 0x7fa5f55acb20>
to_instance = <lib389.DirSrv object at 0x7fa5f559e8e0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    ...  # source and docstring identical to the wait_for_replication listing shown above
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/cleanallruv_test.py::test_clean_shutdown_crash::setup 39.49
request = <SubRequest 'topology_m2' for <Function test_clean_shutdown_crash>>

@pytest.fixture(scope="module")
def topology_m2(request):
    """Create Replication Deployment with two masters"""

>   topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f53d69a0>
from_instance = <lib389.DirSrv object at 0x7fa5f54568b0>
to_instance = <lib389.DirSrv object at 0x7fa5f53c4af0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    ...  # source and docstring identical to the wait_for_replication listing shown above
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 28706e16-e826-469d-986c-d3d147946b1b / got description=None)  [message repeated for all 20 one-second retries]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 28706e16-e826-469d-986c-d3d147946b1b / got description=None)
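Note: the probe that keeps timing out is an ordinary MOD_REPLACE of the replication_managers group's description on the supplier, followed by a base search on the consumer. A raw python-ldap sketch of the same write/read pair; the group DN and credentials are placeholders, not the DN lib389 computes:

import uuid
import ldap

GROUP_DN = "cn=replication_managers,dc=example,dc=com"   # placeholder DN

def probe_replication(supplier_uri, consumer_uri, binddn, password):
    change = str(uuid.uuid4()).encode()

    # Write a fresh UUID into the group's description on the supplier.
    sup = ldap.initialize(supplier_uri)
    sup.simple_bind_s(binddn, password)
    sup.modify_s(GROUP_DN, [(ldap.MOD_REPLACE, "description", [change])])

    # Read it back from the consumer; equality means the change replicated.
    con = ldap.initialize(consumer_uri)
    con.simple_bind_s(binddn, password)
    _dn, attrs = con.search_s(GROUP_DN, ldap.SCOPE_BASE, attrlist=["description"])[0]
    return attrs.get("description", [b""])[0] == change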
Error suites/replication/conflict_resolve_test.py::TestTwoMasters::test_add_modrdn::setup 39.42
request = <SubRequest 'topology_m2' for <Function test_add_modrdn>>

@pytest.fixture(scope="class")
def topology_m2(request):
    """Create Replication Deployment with two masters"""

>   topology = create_topology({ReplicaRole.MASTER: 2})

suites/replication/conftest.py:28:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f539e850>
from_instance = <lib389.DirSrv object at 0x7fa5f538d340>
to_instance = <lib389.DirSrv object at 0x7fa5f545ef40>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    ...  # source and docstring identical to the wait_for_replication listing shown above
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d4afb179-0557-4437-8e66-c1ba45ad4e1c / got description=None)
(last message repeated 19 more times, once per second, until the 20 s timeout expired)
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d4afb179-0557-4437-8e66-c1ba45ad4e1c / got description=None)
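Every error entry in this section is the same probe failing during fixture setup: wait_for_replication() writes a fresh UUID into the description attribute of the replication_managers group on the supplier, then polls the consumer once per second for up to 20 seconds waiting for that value to replicate. The probe can be reproduced outside pytest; below is a minimal sketch, assuming two already-installed instances reachable through the hypothetical DirSrv handles m1 and m2, and the dc=example,dc=com suffix shown in the logs:

import time
import uuid

from lib389.idm.group import Groups

# m1 and m2 are lib389.DirSrv objects bound to the two masters
# (hypothetical handles; obtain them however your deployment provides).
suffix = 'dc=example,dc=com'
from_group = Groups(m1, basedn=suffix, rdn=None).get('replication_managers')
to_group = Groups(m2, basedn=suffix, rdn=None).get('replication_managers')

change = str(uuid.uuid4())
from_group.replace('description', change)   # touch something on the supplier
for _ in range(20):                         # same 20-second budget as the fixture
    if to_group.get_attr_val_utf8('description') == change:
        print('replication is working')
        break
    time.sleep(1)
else:
    print('replication did not sync in time')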
Error suites/replication/conflict_resolve_test.py::TestTwoMasters::test_complex_add_modify_modrdn_delete::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_add_modrdn>>

@pytest.fixture(scope="class")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

suites/replication/conftest.py:28:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f539e850>
from_instance = <lib389.DirSrv object at 0x7fa5f538d340>
to_instance = <lib389.DirSrv object at 0x7fa5f545ef40>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/conflict_resolve_test.py::TestTwoMasters::test_memberof_groups::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_add_modrdn>>

@pytest.fixture(scope="class")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

suites/replication/conftest.py:28:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f539e850>
from_instance = <lib389.DirSrv object at 0x7fa5f538d340>
to_instance = <lib389.DirSrv object at 0x7fa5f545ef40>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/conflict_resolve_test.py::TestTwoMasters::test_managed_entries::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_add_modrdn>>

@pytest.fixture(scope="class")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

suites/replication/conftest.py:28:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f539e850>
from_instance = <lib389.DirSrv object at 0x7fa5f538d340>
to_instance = <lib389.DirSrv object at 0x7fa5f545ef40>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/conflict_resolve_test.py::TestTwoMasters::test_nested_entries_with_children::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_add_modrdn>>

@pytest.fixture(scope="class")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

suites/replication/conftest.py:28:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f539e850>
from_instance = <lib389.DirSrv object at 0x7fa5f538d340>
to_instance = <lib389.DirSrv object at 0x7fa5f545ef40>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/conflict_resolve_test.py::TestTwoMasters::test_conflict_attribute_multi_valued::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_add_modrdn>>

@pytest.fixture(scope="class")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

suites/replication/conftest.py:28:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f539e850>
from_instance = <lib389.DirSrv object at 0x7fa5f538d340>
to_instance = <lib389.DirSrv object at 0x7fa5f545ef40>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/conflict_resolve_test.py::TestTwoMasters::test_conflict_attribute_single_valued::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_add_modrdn>>

@pytest.fixture(scope="class")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

suites/replication/conftest.py:28:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f539e850>
from_instance = <lib389.DirSrv object at 0x7fa5f538d340>
to_instance = <lib389.DirSrv object at 0x7fa5f545ef40>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
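The six TestTwoMasters entries above report a 0.00 s duration because the class-scoped topology_m2 fixture failed once and pytest replays that setup error for every test in the class. The 20-second budget itself is the default timeout of wait_for_replication, which test_replication calls without an override. When diagnosing a live deployment by hand, a longer budget can be passed explicitly; a minimal sketch, assuming ReplicationManager is constructed from the replicated suffix (as the self._suffix usage in the traceback suggests) and m1/m2 are DirSrv handles as before:

from lib389.replica import ReplicationManager

# Assumption: the constructor takes the replicated suffix.
repl = ReplicationManager('dc=example,dc=com')
# m1 and m2: DirSrv handles as in the earlier sketch.
repl.wait_for_replication(m1, m2, timeout=60)   # 60 s instead of the default 20 s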
Error suites/replication/conflict_resolve_test.py::TestThreeMasters::test_nested_entries::setup 46.56
request = <SubRequest 'topology_m3' for <Function test_nested_entries>>

@pytest.fixture(scope="class")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

suites/replication/conftest.py:44:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f5087c40>
from_instance = <lib389.DirSrv object at 0x7fa5f5341850>
to_instance = <lib389.DirSrv object at 0x7fa5f533ba00>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
Instance slapd-master2 removed.
Instance slapd-master3 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 172546fc-f139-4e3a-8316-6a5a853b69d4 / got description=None)
(last message repeated 19 more times, once per second, until the 20 s timeout expired)
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 172546fc-f139-4e3a-8316-6a5a853b69d4 / got description=None)
Error suites/replication/encryption_cl5_test.py::test_algorithm_unhashed::setup 39.81
request = <SubRequest 'topology_m2' for <Function test_algorithm_unhashed>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f5067d00>
from_instance = <lib389.DirSrv object at 0x7fa5f50b3130>
to_instance = <lib389.DirSrv object at 0x7fa5f50abd30>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 06c500c7-f3a2-410d-8553-3bb611440145 / got description=None)
(last message repeated 19 more times, once per second, until the 20 s timeout expired)
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 06c500c7-f3a2-410d-8553-3bb611440145 / got description=None)
Error suites/replication/multiple_changelogs_test.py::test_multiple_changelogs::setup 39.99
request = <SubRequest 'topo' for <Function test_multiple_changelogs>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f541d2e0>
from_instance = <lib389.DirSrv object at 0x7fa5f4f7da30>
to_instance = <lib389.DirSrv object at 0x7fa5f4f7da90>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
Instance slapd-consumer1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:175 Joining consumer consumer1 from master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect b26cc8bd-47ff-4955-a078-39361a07955f / got description=None)
(last message repeated 19 more times, once per second, until the 20 s timeout expired)
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect b26cc8bd-47ff-4955-a078-39361a07955f / got description=None)
Error suites/replication/multiple_changelogs_test.py::test_multiple_changelogs_export_import::setup 0.00
request = <SubRequest 'topo' for <Function test_multiple_changelogs>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f541d2e0>
from_instance = <lib389.DirSrv object at 0x7fa5f4f7da30>
to_instance = <lib389.DirSrv object at 0x7fa5f4f7da90>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
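The multiple_changelogs errors differ only in topology: here create_topology is given one master and one consumer, and join_consumer builds a single one-way agreement (hence the lone Agreement line in the captured log above) before running the same probe. A minimal sketch of building that topology directly; create_topology's module is taken from the traceback path, while the ReplicaRole import path and the topo.ms/topo.cs accessors are assumptions:

from lib389._constants import ReplicaRole   # assumed import path
from lib389.topologies import create_topology

topo = create_topology({ReplicaRole.MASTER: 1,
                        ReplicaRole.CONSUMER: 1})
master = topo.ms['master1']        # assumed accessors, mirroring the
consumer = topo.cs['consumer1']    # server-ids in the captured log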
Error suites/replication/regression_test.py::test_double_delete::setup 41.72
request = <SubRequest 'topo_m2' for <Function test_double_delete>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f50c5d30>
from_instance = <lib389.DirSrv object at 0x7fa5f504ca30>
to_instance = <lib389.DirSrv object at 0x7fa5f50c1400>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 921e9334-767c-42be-aefd-93a4abace334 / got description=None)
(last message repeated 19 more times, once per second, until the 20 s timeout expired)
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 921e9334-767c-42be-aefd-93a4abace334 / got description=None)
Error suites/replication/regression_test.py::test_repl_modrdn::setup 0.00
request = <SubRequest 'topo_m2' for <Function test_double_delete>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f50c5d30>
from_instance = <lib389.DirSrv object at 0x7fa5f504ca30>
to_instance = <lib389.DirSrv object at 0x7fa5f50c1400>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
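Every setup error in this run is the same failure mode: create_topology() builds both masters and both agreements, but the post-join probe never sees its test value arrive on the consumer. The probe itself is visible in the traceback above; the following is a minimal standalone sketch of the same check, assuming m1 and m2 are two already-joined lib389.DirSrv instances (hypothetical names) and that Groups is imported from lib389.idm.group (import path assumed, since the traceback shows only the bare name):

import time
import uuid

from lib389.idm.group import Groups  # import path assumed

SUFFIX = 'dc=example,dc=com'

def probe_replication(m1, m2, timeout=20):
    """Push one attribute change from m1 and poll m2 for it, as wait_for_replication does."""
    from_group = Groups(m1, basedn=SUFFIX, rdn=None).get('replication_managers')
    to_group = Groups(m2, basedn=SUFFIX, rdn=None).get('replication_managers')
    # A unique token guarantees a stale leftover value can never match.
    change = str(uuid.uuid4())
    from_group.replace('description', change)
    # Poll the consumer once per second until the token arrives or time runs out.
    for _ in range(timeout):
        if to_group.get_attr_val_utf8('description') == change:
            return True
        time.sleep(1)
    raise Exception("Replication did not sync in time!")

In every entry here the consumer still reports description=None after all 20 polls, i.e. not even the first change crosses the agreement within the window.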
Error suites/replication/regression_test.py::test_password_repl_error::setup 0.00
request = <SubRequest 'topo_m2' for <Function test_double_delete>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f50c5d30>
from_instance = <lib389.DirSrv object at 0x7fa5f504ca30>
to_instance = <lib389.DirSrv object at 0x7fa5f50c1400>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/regression_test.py::test_invalid_agmt::setup 0.00
request = <SubRequest 'topo_m2' for <Function test_double_delete>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f50c5d30>
from_instance = <lib389.DirSrv object at 0x7fa5f504ca30>
to_instance = <lib389.DirSrv object at 0x7fa5f50c1400>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/regression_test.py::test_fetch_bindDnGroup::setup 0.00
request = <SubRequest 'topo_m2' for <Function test_double_delete>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f50c5d30>
from_instance = <lib389.DirSrv object at 0x7fa5f504ca30>
to_instance = <lib389.DirSrv object at 0x7fa5f50c1400>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/regression_test.py::test_plugin_bind_dn_tracking_and_replication::setup 0.00
request = <SubRequest 'topo_m2' for <Function test_double_delete>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f50c5d30>
from_instance = <lib389.DirSrv object at 0x7fa5f504ca30>
to_instance = <lib389.DirSrv object at 0x7fa5f50c1400>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/regression_test.py::test_cleanallruv_repl::setup 48.65
request = <SubRequest 'topo_m3' for <Function test_cleanallruv_repl>>

@pytest.fixture(scope="module")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f54c6c10>
from_instance = <lib389.DirSrv object at 0x7fa5f54c5220>
to_instance = <lib389.DirSrv object at 0x7fa5f54b9c40>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed. Instance slapd-master3 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 8e597652-06fc-4a5b-9dae-ded30dd47a17 / got description=None) [message repeated 20 times, once per second of the 20 s timeout]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 8e597652-06fc-4a5b-9dae-ded30dd47a17 / got description=None)
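The same stall also takes down the three-master topology: the probe between master1 and master2 fails before master3 is ever joined. The setup path is identical in every entry, because the suites share module-scoped fixtures built on create_topology(). A sketch of that fixture pattern follows, with import paths partly assumed (create_topology is in lib389.topologies per the traceback; ReplicaRole is assumed to live in lib389._constants) and the final return added to complete what the traceback elides:

import pytest

from lib389._constants import ReplicaRole       # import path assumed
from lib389.topologies import create_topology   # path shown in the traceback

@pytest.fixture(scope="module")
def topology_m3(request):
    """Create Replication Deployment with three masters"""
    # join_master() and the 20 s replication probe run inside create_topology(),
    # so a stalled agreement surfaces as a setup *error* for every test that
    # requests this module-scoped fixture, not as an individual test failure.
    topology = create_topology({ReplicaRole.MASTER: 3})
    return topology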
Error suites/replication/regression_test.py::test_online_reinit_may_hang::setup 39.58
request = <SubRequest 'topo_with_sigkill' for <Function test_online_reinit_may_hang>>

@pytest.fixture(scope="module")
def topo_with_sigkill(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

suites/replication/regression_test.py:117:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f51c4c10>
from_instance = <lib389.DirSrv object at 0x7fa5f54c6ee0>
to_instance = <lib389.DirSrv object at 0x7fa5f51bbd30>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect ad32ef65-e47e-4bdb-a262-000681019345 / got description=None) [message repeated 20 times, once per second of the 20 s timeout]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect ad32ef65-e47e-4bdb-a262-000681019345 / got description=None)
Error suites/replication/regression_test.py::test_moving_entry_make_online_init_fail::setup 39.18
request = <SubRequest 'topology_m2' for <Function test_moving_entry_make_online_init_fail>>

@pytest.fixture(scope="class")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

suites/replication/conftest.py:28:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f52f17c0>
from_instance = <lib389.DirSrv object at 0x7fa5f51e9820>
to_instance = <lib389.DirSrv object at 0x7fa5f52f8130>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c5e17377-ca2d-417a-8b1d-24e08b77328b / got description=None) [message repeated 20 times, once per second of the 20 s timeout]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c5e17377-ca2d-417a-8b1d-24e08b77328b / got description=None)
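When triaging a stall like this by hand, the same probe can be re-run with a longer budget to separate "slow" from "broken". A hedged sketch, assuming ReplicationManager is constructed with the replicated suffix (inferred from its use of self._suffix in the code above) and that m1 and m2 are hypothetical names for the two master instances:

from lib389.replica import ReplicationManager

# Constructor argument assumed from the suffix handling shown above.
repl = ReplicationManager('dc=example,dc=com')

# Same check join_master() performs, with triple the default 20 s window;
# if this also raises, the agreements are broken rather than merely slow.
repl.wait_for_replication(m1, m2, timeout=60)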
Error suites/replication/repl_agmt_bootstrap_test.py::test_repl_agmt_bootstrap_credentials::setup 41.90
request = <SubRequest 'topo' for <Function test_repl_agmt_bootstrap_credentials>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f54a8d30>
from_instance = <lib389.DirSrv object at 0x7fa5f54c8400>
to_instance = <lib389.DirSrv object at 0x7fa5f549c910>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 72dc8963-96bb-4bfc-b824-2761fc0b483f / got description=None) [message repeated 20 times, once per second of the 20 s timeout]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 72dc8963-96bb-4bfc-b824-2761fc0b483f / got description=None)
Error suites/replication/ruvstore_test.py::test_ruv_entry_backup::setup 39.25
request = <SubRequest 'topo' for <Function test_ruv_entry_backup>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f52885e0>
from_instance = <lib389.DirSrv object at 0x7fa5f8accc10>
to_instance = <lib389.DirSrv object at 0x7fa5f5294ca0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    """Wait for a replication event to occur from instance to instance. This
    shows some point of synchronisation has occurred.

    :param from_instance: The instance whose state we want to check from.
    :type from_instance: lib389.DirSrv
    :param to_instance: The instance whose state we want to check matches from.
    :type to_instance: lib389.DirSrv
    :param timeout: Fail after timeout seconds.
    :type timeout: int

    """
    # Touch something then wait_for_replication.
    from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
    to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
    from_group = from_groups.get('replication_managers')
    to_group = to_groups.get('replication_managers')

    change = str(uuid.uuid4())

    from_group.replace('description', change)

    for i in range(0, timeout):
        desc = to_group.get_attr_val_utf8('description')
        if change == desc:
            self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
            return True
        self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
        time.sleep(1)
    self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 5a8bf4e3-87de-4c94-8d44-6afa60540d8d / got description=None)
[previous Retry record repeated 19 more times, once per second]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 5a8bf4e3-87de-4c94-8d44-6afa60540d8d / got description=None)
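The fixtures in these entries all delegate to lib389's create_topology, so the failure can be reproduced outside pytest to separate fixture mechanics from the replication problem itself. A sketch under the assumption that this report's environment is available; create_topology and ReplicaRole are the symbols from the traceback, while the .ms accessor follows lib389's topology convention and is an assumption here:

# Sketch: build the same two-master topology directly. create_topology
# raises the same "Replication did not sync in time!" exception on failure.
from lib389._constants import ReplicaRole
from lib389.topologies import create_topology

topo = create_topology({ReplicaRole.MASTER: 2})
m1 = topo.ms["master1"]  # assumed accessor for the created instances
m2 = topo.ms["master2"]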
Error suites/replication/series_of_repl_bugs_test.py::test_deletions_are_not_replicated::setup 39.26
request = <SubRequest 'topo_m2' for <Function test_deletions_are_not_replicated>>

@pytest.fixture(scope="module")
def topology_m2(request):
    """Create Replication Deployment with two masters"""

>   topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f55f3b20>
from_instance = <lib389.DirSrv object at 0x7fa5f950cfa0>
to_instance = <lib389.DirSrv object at 0x7fa5f540d910>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    """Wait for a replication event to occur from instance to instance. This
    shows some point of synchronisation has occurred.

    :param from_instance: The instance whose state we want to check from.
    :type from_instance: lib389.DirSrv
    :param to_instance: The instance whose state we want to check matches from.
    :type to_instance: lib389.DirSrv
    :param timeout: Fail after timeout seconds.
    :type timeout: int

    """
    # Touch something then wait_for_replication.
    from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
    to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
    from_group = from_groups.get('replication_managers')
    to_group = to_groups.get('replication_managers')

    change = str(uuid.uuid4())

    from_group.replace('description', change)

    for i in range(0, timeout):
        desc = to_group.get_attr_val_utf8('description')
        if change == desc:
            self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
            return True
        self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
        time.sleep(1)
    self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d1408da0-d5dd-4d54-acab-739f68369974 / got description=None)
[previous Retry record repeated 19 more times, once per second]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d1408da0-d5dd-4d54-acab-739f68369974 / got description=None)
Error suites/replication/series_of_repl_bugs_test.py::test_error_20::setup 0.00
request = <SubRequest 'topo_m2' for <Function test_deletions_are_not_replicated>>

@pytest.fixture(scope="module")
def topology_m2(request):
    """Create Replication Deployment with two masters"""

>   topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f55f3b20>
from_instance = <lib389.DirSrv object at 0x7fa5f950cfa0>
to_instance = <lib389.DirSrv object at 0x7fa5f540d910>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    """Wait for a replication event to occur from instance to instance. This
    shows some point of synchronisation has occurred.

    :param from_instance: The instance whose state we want to check from.
    :type from_instance: lib389.DirSrv
    :param to_instance: The instance whose state we want to check matches from.
    :type to_instance: lib389.DirSrv
    :param timeout: Fail after timeout seconds.
    :type timeout: int

    """
    # Touch something then wait_for_replication.
    from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
    to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
    from_group = from_groups.get('replication_managers')
    to_group = to_groups.get('replication_managers')

    change = str(uuid.uuid4())

    from_group.replace('description', change)

    for i in range(0, timeout):
        desc = to_group.get_attr_val_utf8('description')
        if change == desc:
            self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
            return True
        self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
        time.sleep(1)
    self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
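This entry and the six that follow it show 0.00 s durations and repeat the SubRequest captured for test_deletions_are_not_replicated: pytest runs a module-scoped fixture once, caches its failure, and re-raises the same exception for every remaining test in the module without re-running the fixture. A minimal, lib389-independent sketch of that behaviour:

# Once a module-scoped fixture raises, every later test in the module
# errors instantly with the cached traceback.
import pytest

@pytest.fixture(scope="module")
def broken_topo():
    raise Exception("Replication did not sync in time!")

def test_first(broken_topo):
    pass  # errors after the fixture body runs (non-zero setup time)

def test_second(broken_topo):
    pass  # errors in ~0.00 s; the fixture is not executed again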
Error suites/replication/series_of_repl_bugs_test.py::test_segfaults::setup 0.00
request = <SubRequest 'topo_m2' for <Function test_deletions_are_not_replicated>>

@pytest.fixture(scope="module")
def topology_m2(request):
    """Create Replication Deployment with two masters"""

>   topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f55f3b20>
from_instance = <lib389.DirSrv object at 0x7fa5f950cfa0>
to_instance = <lib389.DirSrv object at 0x7fa5f540d910>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    """Wait for a replication event to occur from instance to instance. This
    shows some point of synchronisation has occurred.

    :param from_instance: The instance whose state we want to check from.
    :type from_instance: lib389.DirSrv
    :param to_instance: The instance whose state we want to check matches from.
    :type to_instance: lib389.DirSrv
    :param timeout: Fail after timeout seconds.
    :type timeout: int

    """
    # Touch something then wait_for_replication.
    from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
    to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
    from_group = from_groups.get('replication_managers')
    to_group = to_groups.get('replication_managers')

    change = str(uuid.uuid4())

    from_group.replace('description', change)

    for i in range(0, timeout):
        desc = to_group.get_attr_val_utf8('description')
        if change == desc:
            self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
            return True
        self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
        time.sleep(1)
    self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/series_of_repl_bugs_test.py::test_adding_deleting::setup 0.00
request = <SubRequest 'topo_m2' for <Function test_deletions_are_not_replicated>>

@pytest.fixture(scope="module")
def topology_m2(request):
    """Create Replication Deployment with two masters"""

>   topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f55f3b20>
from_instance = <lib389.DirSrv object at 0x7fa5f950cfa0>
to_instance = <lib389.DirSrv object at 0x7fa5f540d910>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    """Wait for a replication event to occur from instance to instance. This
    shows some point of synchronisation has occurred.

    :param from_instance: The instance whose state we want to check from.
    :type from_instance: lib389.DirSrv
    :param to_instance: The instance whose state we want to check matches from.
    :type to_instance: lib389.DirSrv
    :param timeout: Fail after timeout seconds.
    :type timeout: int

    """
    # Touch something then wait_for_replication.
    from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
    to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
    from_group = from_groups.get('replication_managers')
    to_group = to_groups.get('replication_managers')

    change = str(uuid.uuid4())

    from_group.replace('description', change)

    for i in range(0, timeout):
        desc = to_group.get_attr_val_utf8('description')
        if change == desc:
            self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
            return True
        self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
        time.sleep(1)
    self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/series_of_repl_bugs_test.py::test_deleting_twice::setup 0.00
request = <SubRequest 'topo_m2' for <Function test_deletions_are_not_replicated>>

@pytest.fixture(scope="module")
def topology_m2(request):
    """Create Replication Deployment with two masters"""

>   topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f55f3b20>
from_instance = <lib389.DirSrv object at 0x7fa5f950cfa0>
to_instance = <lib389.DirSrv object at 0x7fa5f540d910>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    """Wait for a replication event to occur from instance to instance. This
    shows some point of synchronisation has occurred.

    :param from_instance: The instance whose state we want to check from.
    :type from_instance: lib389.DirSrv
    :param to_instance: The instance whose state we want to check matches from.
    :type to_instance: lib389.DirSrv
    :param timeout: Fail after timeout seconds.
    :type timeout: int

    """
    # Touch something then wait_for_replication.
    from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
    to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
    from_group = from_groups.get('replication_managers')
    to_group = to_groups.get('replication_managers')

    change = str(uuid.uuid4())

    from_group.replace('description', change)

    for i in range(0, timeout):
        desc = to_group.get_attr_val_utf8('description')
        if change == desc:
            self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
            return True
        self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
        time.sleep(1)
    self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/series_of_repl_bugs_test.py::test_rename_entry::setup 0.00
request = <SubRequest 'topo_m2' for <Function test_deletions_are_not_replicated>>

@pytest.fixture(scope="module")
def topology_m2(request):
    """Create Replication Deployment with two masters"""

>   topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f55f3b20>
from_instance = <lib389.DirSrv object at 0x7fa5f950cfa0>
to_instance = <lib389.DirSrv object at 0x7fa5f540d910>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    """Wait for a replication event to occur from instance to instance. This
    shows some point of synchronisation has occurred.

    :param from_instance: The instance whose state we want to check from.
    :type from_instance: lib389.DirSrv
    :param to_instance: The instance whose state we want to check matches from.
    :type to_instance: lib389.DirSrv
    :param timeout: Fail after timeout seconds.
    :type timeout: int

    """
    # Touch something then wait_for_replication.
    from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
    to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
    from_group = from_groups.get('replication_managers')
    to_group = to_groups.get('replication_managers')

    change = str(uuid.uuid4())

    from_group.replace('description', change)

    for i in range(0, timeout):
        desc = to_group.get_attr_val_utf8('description')
        if change == desc:
            self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
            return True
        self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
        time.sleep(1)
    self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/series_of_repl_bugs_test.py::test_userpassword_attribute::setup 0.00
request = <SubRequest 'topo_m2' for <Function test_deletions_are_not_replicated>>

@pytest.fixture(scope="module")
def topology_m2(request):
    """Create Replication Deployment with two masters"""

>   topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f55f3b20>
from_instance = <lib389.DirSrv object at 0x7fa5f950cfa0>
to_instance = <lib389.DirSrv object at 0x7fa5f540d910>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    """Wait for a replication event to occur from instance to instance. This
    shows some point of synchronisation has occurred.

    :param from_instance: The instance whose state we want to check from.
    :type from_instance: lib389.DirSrv
    :param to_instance: The instance whose state we want to check matches from.
    :type to_instance: lib389.DirSrv
    :param timeout: Fail after timeout seconds.
    :type timeout: int

    """
    # Touch something then wait_for_replication.
    from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
    to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
    from_group = from_groups.get('replication_managers')
    to_group = to_groups.get('replication_managers')

    change = str(uuid.uuid4())

    from_group.replace('description', change)

    for i in range(0, timeout):
        desc = to_group.get_attr_val_utf8('description')
        if change == desc:
            self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
            return True
        self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
        time.sleep(1)
    self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/series_of_repl_bugs_test.py::test_tombstone_modrdn::setup 0.00
request = <SubRequest 'topo_m2' for <Function test_deletions_are_not_replicated>>

@pytest.fixture(scope="module")
def topology_m2(request):
    """Create Replication Deployment with two masters"""

>   topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f55f3b20>
from_instance = <lib389.DirSrv object at 0x7fa5f950cfa0>
to_instance = <lib389.DirSrv object at 0x7fa5f540d910>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    """Wait for a replication event to occur from instance to instance. This
    shows some point of synchronisation has occurred.

    :param from_instance: The instance whose state we want to check from.
    :type from_instance: lib389.DirSrv
    :param to_instance: The instance whose state we want to check matches from.
    :type to_instance: lib389.DirSrv
    :param timeout: Fail after timeout seconds.
    :type timeout: int

    """
    # Touch something then wait_for_replication.
    from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
    to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
    from_group = from_groups.get('replication_managers')
    to_group = to_groups.get('replication_managers')

    change = str(uuid.uuid4())

    from_group.replace('description', change)

    for i in range(0, timeout):
        desc = to_group.get_attr_val_utf8('description')
        if change == desc:
            self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
            return True
        self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
        time.sleep(1)
    self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/single_master_test.py::test_mail_attr_repl::setup 40.78
request = <SubRequest 'topo_r' for <Function test_mail_attr_repl>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
    """Create Replication Deployment with one master and one consumer"""

>   topology = create_topology({ReplicaRole.MASTER: 1,
                                ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f552d700>
from_instance = <lib389.DirSrv object at 0x7fa5f4fddf10>
to_instance = <lib389.DirSrv object at 0x7fa5f4fddf70>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    """Wait for a replication event to occur from instance to instance. This
    shows some point of synchronisation has occurred.

    :param from_instance: The instance whose state we want to check from.
    :type from_instance: lib389.DirSrv
    :param to_instance: The instance whose state we want to check matches from.
    :type to_instance: lib389.DirSrv
    :param timeout: Fail after timeout seconds.
    :type timeout: int

    """
    # Touch something then wait_for_replication.
    from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
    to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
    from_group = from_groups.get('replication_managers')
    to_group = to_groups.get('replication_managers')

    change = str(uuid.uuid4())

    from_group.replace('description', change)

    for i in range(0, timeout):
        desc = to_group.get_attr_val_utf8('description')
        if change == desc:
            self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
            return True
        self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
        time.sleep(1)
    self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-consumer1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:175 Joining consumer consumer1 from master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9d73f11d-2987-4e8f-b0ad-907b47bb8081 / got description=None)
[previous Retry record repeated 19 more times, once per second]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 9d73f11d-2987-4e8f-b0ad-907b47bb8081 / got description=None)
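The master-to-consumer case fails in exactly the same way, this time polling the consumer on port 39201. Because every failure in this report exhausts the default timeout=20, a quick way to distinguish slow replication from broken replication is to re-run the check with a longer timeout. A debugging sketch (not part of the suite) using the ReplicationManager symbols from the traceback; the master and consumer handles and DEFAULT_SUFFIX are assumed to exist in the session:

# Sketch: same check as the fixture runs, with a more generous timeout.
from lib389._constants import DEFAULT_SUFFIX
from lib389.replica import ReplicationManager

repl = ReplicationManager(DEFAULT_SUFFIX)
repl.wait_for_replication(master, consumer, timeout=120)  # raises on failure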
Error suites/replication/tls_client_auth_repl_test.py::test_ssl_transport::setup 40.60
request = <SubRequest 'topo_m2' for <Function test_ssl_transport>>

@pytest.fixture(scope="module")
def topology_m2(request):
    """Create Replication Deployment with two masters"""

>   topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f55cdbb0>
from_instance = <lib389.DirSrv object at 0x7fa5f55c2c40>
to_instance = <lib389.DirSrv object at 0x7fa5f55ca910>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    """Wait for a replication event to occur from instance to instance. This
    shows some point of synchronisation has occurred.

    :param from_instance: The instance whose state we want to check from.
    :type from_instance: lib389.DirSrv
    :param to_instance: The instance whose state we want to check matches from.
    :type to_instance: lib389.DirSrv
    :param timeout: Fail after timeout seconds.
    :type timeout: int

    """
    # Touch something then wait_for_replication.
    from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
    to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
    from_group = from_groups.get('replication_managers')
    to_group = to_groups.get('replication_managers')

    change = str(uuid.uuid4())

    from_group.replace('description', change)

    for i in range(0, timeout):
        desc = to_group.get_attr_val_utf8('description')
        if change == desc:
            self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
            return True
        self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
        time.sleep(1)
    self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e021f093-1411-4ce9-93c6-c4afd9619fa1 / got description=None)
[previous Retry record repeated 19 more times, once per second]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e021f093-1411-4ce9-93c6-c4afd9619fa1 / got description=None)
Error suites/replication/tls_client_auth_repl_test.py::test_extract_pemfiles::setup 0.00
request = <SubRequest 'topo_m2' for <Function test_ssl_transport>>

@pytest.fixture(scope="module")
def topology_m2(request):
    """Create Replication Deployment with two masters"""

>   topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f55cdbb0>
from_instance = <lib389.DirSrv object at 0x7fa5f55c2c40>
to_instance = <lib389.DirSrv object at 0x7fa5f55ca910>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
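Every setup error in this report fails the same way: wait_for_replication() writes a fresh UUID into the 'description' attribute of the 'replication_managers' group on the sending instance, then polls the receiving instance once per second for at most 20 seconds (the default timeout shown above). When triaging these failures it can help to rerun that touch-and-poll check by hand with a longer window, to separate "replication is slow" from "replication is broken". The following is a minimal sketch, not part of the report: it reuses only the lib389 names visible in the traceback (Groups, replace, get_attr_val_utf8), and it assumes m1 and m2 are already-connected DirSrv objects for the two masters; DEFAULT_SUFFIX and the 120-second window are choices made for this example.

# Manual re-run of the touch-and-poll check, with a longer timeout.
# Assumes m1 and m2 are connected lib389 DirSrv objects (obtained elsewhere).
import time
import uuid

from lib389._constants import DEFAULT_SUFFIX
from lib389.idm.group import Groups

def check_sync(m1, m2, timeout=120):
    """Write a unique description on m1 and poll m2 until it arrives."""
    from_group = Groups(m1, basedn=DEFAULT_SUFFIX, rdn=None).get('replication_managers')
    to_group = Groups(m2, basedn=DEFAULT_SUFFIX, rdn=None).get('replication_managers')
    change = str(uuid.uuid4())
    from_group.replace('description', change)
    for _ in range(timeout):
        if to_group.get_attr_val_utf8('description') == change:
            return True   # change replicated within the window
        time.sleep(1)
    return False          # still not replicated after `timeout` seconds

If check_sync() succeeds with the larger window, the fixture's 20-second default was simply too tight for this host; if it never succeeds, the agreement itself (bind credentials, TLS, connectivity between the instances) is the more likely culprit.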
Error suites/replication/wait_for_async_feature_test.py::test_not_int_value::setup 40.80
request = <SubRequest 'topology_m2' for <Function test_not_int_value>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4cac5e0>
from_instance = <lib389.DirSrv object at 0x7fa5f55714c0>
to_instance = <lib389.DirSrv object at 0x7fa5f5546be0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d2aee2f4-a588-41d2-af7c-61cc828897a2 / got description=None) [message repeated 20 times, once per second]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d2aee2f4-a588-41d2-af7c-61cc828897a2 / got description=None)
Error suites/replication/wait_for_async_feature_test.py::test_multi_value::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_not_int_value>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4cac5e0>
from_instance = <lib389.DirSrv object at 0x7fa5f55714c0>
to_instance = <lib389.DirSrv object at 0x7fa5f5546be0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/wait_for_async_feature_test.py::test_value_check[waitfor_async_attr0]::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_not_int_value>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4cac5e0>
from_instance = <lib389.DirSrv object at 0x7fa5f55714c0>
to_instance = <lib389.DirSrv object at 0x7fa5f5546be0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/wait_for_async_feature_test.py::test_value_check[waitfor_async_attr1]::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_not_int_value>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4cac5e0>
from_instance = <lib389.DirSrv object at 0x7fa5f55714c0>
to_instance = <lib389.DirSrv object at 0x7fa5f5546be0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/wait_for_async_feature_test.py::test_value_check[waitfor_async_attr2]::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_not_int_value>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4cac5e0>
from_instance = <lib389.DirSrv object at 0x7fa5f55714c0>
to_instance = <lib389.DirSrv object at 0x7fa5f5546be0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/wait_for_async_feature_test.py::test_value_check[waitfor_async_attr3]::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_not_int_value>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4cac5e0>
from_instance = <lib389.DirSrv object at 0x7fa5f55714c0>
to_instance = <lib389.DirSrv object at 0x7fa5f5546be0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/wait_for_async_feature_test.py::test_behavior_with_value[waitfor_async_attr0]::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_not_int_value>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4cac5e0>
from_instance = <lib389.DirSrv object at 0x7fa5f55714c0>
to_instance = <lib389.DirSrv object at 0x7fa5f5546be0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/wait_for_async_feature_test.py::test_behavior_with_value[waitfor_async_attr1]::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_not_int_value>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4cac5e0>
from_instance = <lib389.DirSrv object at 0x7fa5f55714c0>
to_instance = <lib389.DirSrv object at 0x7fa5f5546be0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/wait_for_async_feature_test.py::test_behavior_with_value[waitfor_async_attr2]::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_not_int_value>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4cac5e0>
from_instance = <lib389.DirSrv object at 0x7fa5f55714c0>
to_instance = <lib389.DirSrv object at 0x7fa5f5546be0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/replication/wait_for_async_feature_test.py::test_behavior_with_value[waitfor_async_attr3]::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_not_int_value>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4cac5e0>
from_instance = <lib389.DirSrv object at 0x7fa5f55714c0>
to_instance = <lib389.DirSrv object at 0x7fa5f5546be0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/sasl/regression_test.py::test_openldap_no_nss_crypto::setup 39.43
request = <SubRequest 'topology_m2' for <Function test_openldap_no_nss_crypto>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4bcaca0>
from_instance = <lib389.DirSrv object at 0x7fa5f4c98970>
to_instance = <lib389.DirSrv object at 0x7fa5f4c951c0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect f854d787-80e9-4b3a-b251-d6a92c1c1edb / got description=None) [message repeated 20 times, once per second]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect f854d787-80e9-4b3a-b251-d6a92c1c1edb / got description=None)
Error suites/schema/schema_replication_test.py::test_schema_replication_one::setup 40.69
request = <SubRequest 'topology_m1c1' for <Function test_schema_replication_one>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4db2be0>
from_instance = <lib389.DirSrv object at 0x7fa5f4e16640>
to_instance = <lib389.DirSrv object at 0x7fa5f4e161c0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
Instance slapd-consumer1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:175 Joining consumer consumer1 from master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 0922a74d-593e-470d-bf07-dd694642dea0 / got description=None) [message repeated 20 times, once per second]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 0922a74d-593e-470d-bf07-dd694642dea0 / got description=None)
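The master-to-consumer check above hits the same 20-second ceiling. When only the timeout is in question, the shortest manual re-check is to call the same wait_for_replication() shown in the traceback with a larger value. A sketch under the same assumptions as before (master and consumer are connected DirSrv objects obtained elsewhere; the ReplicationManager constructor taking the suffix, and the 120-second window, are assumptions for this example):

# Re-run the failing health check by hand with a longer window.
# master/consumer are assumed to be connected lib389 DirSrv objects.
from lib389._constants import DEFAULT_SUFFIX
from lib389.replica import ReplicationManager

repl = ReplicationManager(DEFAULT_SUFFIX)
# Raises Exception("Replication did not sync in time!") if it still fails.
repl.wait_for_replication(master, consumer, timeout=120)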
Error suites/schema/schema_replication_test.py::test_schema_replication_two::setup 0.00
request = <SubRequest 'topology_m1c1' for <Function test_schema_replication_one>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4db2be0>
from_instance = <lib389.DirSrv object at 0x7fa5f4e16640>
to_instance = <lib389.DirSrv object at 0x7fa5f4e161c0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
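The probe above is simple: write a fresh UUID into the description of the replication_managers group on the supplier, then poll the consumer's copy once a second until it matches or the 20-second window closes. Below is a stand-alone sketch of the same touch-and-poll check for re-running it by hand with a wider window; the Groups import path and the already-open supplier/consumer DirSrv connections are assumptions, while the probe logic itself is copied from the replica.py source quoted above.

# Stand-alone re-run of the touch-and-poll probe from replica.py above.
# Assumptions: Groups lives at lib389.idm.group, and `supplier`/`consumer`
# are already-open lib389 DirSrv connections to the two instances.
import time
import uuid

from lib389.idm.group import Groups

SUFFIX = "dc=example,dc=com"  # the suffix used throughout this run

def probe(supplier, consumer, timeout=120):
    """Same check as wait_for_replication, but with a wider window."""
    from_group = Groups(supplier, basedn=SUFFIX, rdn=None).get('replication_managers')
    to_group = Groups(consumer, basedn=SUFFIX, rdn=None).get('replication_managers')

    change = str(uuid.uuid4())
    from_group.replace('description', change)  # touch on the supplier side

    for _ in range(timeout):
        if to_group.get_attr_val_utf8('description') == change:
            return True  # the write arrived on the consumer
        time.sleep(1)
    raise Exception("Replication did not sync in time!")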
Error suites/schema/schema_replication_test.py::test_schema_replication_three::setup 0.00
request = <SubRequest 'topology_m1c1' for <Function test_schema_replication_one>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4db2be0>
from_instance = <lib389.DirSrv object at 0x7fa5f4e16640>
to_instance = <lib389.DirSrv object at 0x7fa5f4e161c0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/schema/schema_replication_test.py::test_schema_replication_four::setup 0.00
request = <SubRequest 'topology_m1c1' for <Function test_schema_replication_one>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4db2be0>
from_instance = <lib389.DirSrv object at 0x7fa5f4e16640>
to_instance = <lib389.DirSrv object at 0x7fa5f4e161c0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/schema/schema_replication_test.py::test_schema_replication_five::setup 0.00
request = <SubRequest 'topology_m1c1' for <Function test_schema_replication_one>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4db2be0>
from_instance = <lib389.DirSrv object at 0x7fa5f4e16640>
to_instance = <lib389.DirSrv object at 0x7fa5f4e161c0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/schema/schema_replication_test.py::test_schema_replication_six::setup 0.00
request = <SubRequest 'topology_m1c1' for <Function test_schema_replication_one>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4db2be0>
from_instance = <lib389.DirSrv object at 0x7fa5f4e16640>
to_instance = <lib389.DirSrv object at 0x7fa5f4e161c0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/schema/schema_replication_test.py::test_schema_replication_seven::setup 0.00
request = <SubRequest 'topology_m1c1' for <Function test_schema_replication_one>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4db2be0>
from_instance = <lib389.DirSrv object at 0x7fa5f4e16640>
to_instance = <lib389.DirSrv object at 0x7fa5f4e161c0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/schema/schema_replication_test.py::test_schema_replication_eight::setup 0.00
request = <SubRequest 'topology_m1c1' for <Function test_schema_replication_one>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4db2be0>
from_instance = <lib389.DirSrv object at 0x7fa5f4e16640>
to_instance = <lib389.DirSrv object at 0x7fa5f4e161c0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/schema/schema_replication_test.py::test_schema_replication_nine::setup 0.00
request = <SubRequest 'topology_m1c1' for <Function test_schema_replication_one>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4db2be0>
from_instance = <lib389.DirSrv object at 0x7fa5f4e16640>
to_instance = <lib389.DirSrv object at 0x7fa5f4e161c0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
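Note that test_schema_replication_two through test_schema_replication_nine all report the same byte-identical traceback: topology_m1c1 is a module-scoped fixture, so its setup ran (and failed) once, and pytest replays that one exception as the setup error of every test in the module. A minimal, self-contained illustration of that fan-out behaviour (hypothetical names, nothing from lib389):

import pytest

@pytest.fixture(scope="module")
def topo():
    # Fails once; pytest caches the error for the whole module.
    raise Exception("Replication did not sync in time!")

def test_one(topo):
    pass  # reported as "Error ...::setup"

def test_two(topo):
    pass  # also an error, without the fixture running again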
Error suites/state/mmt_state_test.py::test_check_desc_attr_state[description-Test1usr1-ldap.MOD_ADD-exp_values0-vucsn]::setup 41.62
request = <SubRequest 'topo' for <Function test_check_desc_attr_state[description-Test1usr1-ldap.MOD_ADD-exp_values0-vucsn]>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4e3a0a0>
from_instance = <lib389.DirSrv object at 0x7fa5f4e69af0>
to_instance = <lib389.DirSrv object at 0x7fa5f4e47b80>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:148 Creating replication topology. INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  
lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None) INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a55c5068-4f09-4293-8cf3-7837d093e7c9 / got description=None)
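The captured log above shows that the bootstrap and both agreements succeeded and only the data probe never landed, so the agreements' runtime status attributes are the next thing to look at. One way to read them straight from cn=config on the supplier is sketched below; the attribute names are the standard 389-ds ones, and `supplier` is assumed to be an already-open lib389 DirSrv, which exposes the python-ldap search_s API.

import ldap

def dump_agreement_status(supplier):
    # Read the runtime status of every replication agreement on this server.
    results = supplier.search_s(
        "cn=config", ldap.SCOPE_SUBTREE,
        "(objectClass=nsds5replicationAgreement)",
        ["nsds5replicaLastUpdateStatus", "nsds5replicaLastInitStatus"])
    for dn, attrs in results:
        print(dn)
        for name, values in attrs.items():
            print("  %s: %s" % (name, [v.decode("utf-8") for v in values]))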
Error suites/state/mmt_state_test.py::test_check_desc_attr_state[description-Test1usr2-ldap.MOD_ADD-exp_values1-vucsn]::setup 0.00
request = <SubRequest 'topo' for <Function test_check_desc_attr_state[description-Test1usr1-ldap.MOD_ADD-exp_values0-vucsn]>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4e3a0a0>
from_instance = <lib389.DirSrv object at 0x7fa5f4e69af0>
to_instance = <lib389.DirSrv object at 0x7fa5f4e47b80>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/state/mmt_state_test.py::test_check_desc_attr_state[description-Test1usr3-ldap.MOD_ADD-exp_values2-vucsn]::setup 0.00
request = <SubRequest 'topo' for <Function test_check_desc_attr_state[description-Test1usr1-ldap.MOD_ADD-exp_values0-vucsn]>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4e3a0a0>
from_instance = <lib389.DirSrv object at 0x7fa5f4e69af0>
to_instance = <lib389.DirSrv object at 0x7fa5f4e47b80>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/state/mmt_state_test.py::test_check_desc_attr_state[description-Test1usr4-ldap.MOD_REPLACE-exp_values3-adcsn]::setup 0.00
request = <SubRequest 'topo' for <Function test_check_desc_attr_state[description-Test1usr1-ldap.MOD_ADD-exp_values0-vucsn]>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4e3a0a0>
from_instance = <lib389.DirSrv object at 0x7fa5f4e69af0>
to_instance = <lib389.DirSrv object at 0x7fa5f4e47b80>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/state/mmt_state_test.py::test_check_desc_attr_state[description-Test1usr4-ldap.MOD_DELETE-exp_values4-vdcsn]::setup 0.00
request = <SubRequest 'topo' for <Function test_check_desc_attr_state[description-Test1usr1-ldap.MOD_ADD-exp_values0-vucsn]>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4e3a0a0>
from_instance = <lib389.DirSrv object at 0x7fa5f4e69af0>
to_instance = <lib389.DirSrv object at 0x7fa5f4e47b80>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/state/mmt_state_test.py::test_check_cn_attr_state[cn-TestCN1-ldap.MOD_ADD-exp_values0-vucsn]::setup 0.00
request = <SubRequest 'topo' for <Function test_check_desc_attr_state[description-Test1usr1-ldap.MOD_ADD-exp_values0-vucsn]>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4e3a0a0>
from_instance = <lib389.DirSrv object at 0x7fa5f4e69af0>
to_instance = <lib389.DirSrv object at 0x7fa5f4e47b80>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/state/mmt_state_test.py::test_check_cn_attr_state[cn-TestCN2-ldap.MOD_ADD-exp_values1-vucsn]::setup 0.00
request = <SubRequest 'topo' for <Function test_check_desc_attr_state[description-Test1usr1-ldap.MOD_ADD-exp_values0-vucsn]>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4e3a0a0>
from_instance = <lib389.DirSrv object at 0x7fa5f4e69af0>
to_instance = <lib389.DirSrv object at 0x7fa5f4e47b80>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/state/mmt_state_test.py::test_check_cn_attr_state[cn-TestnewCN3-ldap.MOD_REPLACE-exp_values2-adcsn]::setup 0.00
request = <SubRequest 'topo' for <Function test_check_desc_attr_state[description-Test1usr1-ldap.MOD_ADD-exp_values0-vucsn]>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4e3a0a0>
from_instance = <lib389.DirSrv object at 0x7fa5f4e69af0>
to_instance = <lib389.DirSrv object at 0x7fa5f4e47b80>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/state/mmt_state_test.py::test_check_cn_attr_state[cn-TestnewCN3-ldap.MOD_DELETE-None-None]::setup 0.00
request = <SubRequest 'topo' for <Function test_check_desc_attr_state[description-Test1usr1-ldap.MOD_ADD-exp_values0-vucsn]>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4e3a0a0>
from_instance = <lib389.DirSrv object at 0x7fa5f4e69af0>
to_instance = <lib389.DirSrv object at 0x7fa5f4e47b80>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/state/mmt_state_test.py::test_check_single_value_attr_state[preferredlanguage-Chinese-ldap.MOD_REPLACE-exp_values0-vucsn]::setup 0.00
request = <SubRequest 'topo' for <Function test_check_desc_attr_state[description-Test1usr1-ldap.MOD_ADD-exp_values0-vucsn]>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4e3a0a0>
from_instance = <lib389.DirSrv object at 0x7fa5f4e69af0>
to_instance = <lib389.DirSrv object at 0x7fa5f4e47b80>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error suites/state/mmt_state_test.py::test_check_single_value_attr_state[preferredlanguage-French-ldap.MOD_ADD-None-None]::setup 0.00
Error suites/state/mmt_state_test.py::test_check_single_value_attr_state[preferredlanguage-German-ldap.MOD_REPLACE-exp_values2-adcsn]::setup 0.00
Error suites/state/mmt_state_test.py::test_check_single_value_attr_state[preferredlanguage-German-ldap.MOD_DELETE-exp_values3-vdcsn]::setup 0.00
Error suites/state/mmt_state_test.py::test_check_subtype_attr_state[roomnumber;office-Tower1-ldap.MOD_ADD-exp_values0-vucsn]::setup 0.00
Error suites/state/mmt_state_test.py::test_check_subtype_attr_state[roomnumber;office-Tower2-ldap.MOD_ADD-exp_values1-vucsn]::setup 0.00
Error suites/state/mmt_state_test.py::test_check_subtype_attr_state[roomnumber;office-Tower3-ldap.MOD_ADD-exp_values2-vucsn]::setup 0.00
Error suites/state/mmt_state_test.py::test_check_subtype_attr_state[roomnumber;office-Tower4-ldap.MOD_REPLACE-exp_values3-adcsn]::setup 0.00
Error suites/state/mmt_state_test.py::test_check_subtype_attr_state[roomnumber;office-Tower4-ldap.MOD_DELETE-exp_values4-vucsn]::setup 0.00
Error suites/state/mmt_state_test.py::test_check_jpeg_attr_state[jpegphoto-thedeadbeef1-ldap.MOD_ADD-exp_values0-vucsn]::setup 0.00
Error suites/state/mmt_state_test.py::test_check_jpeg_attr_state[jpegphoto-thedeadbeef2-ldap.MOD_ADD-exp_values1-vucsn]::setup 0.00
Error suites/state/mmt_state_test.py::test_check_jpeg_attr_state[jpegphoto-thedeadbeef3-ldap.MOD_ADD-exp_values2-vucsn]::setup 0.00
Error suites/state/mmt_state_test.py::test_check_jpeg_attr_state[jpegphoto-thedeadbeef2-ldap.MOD_REPLACE-exp_values3-adcsn]::setup 0.00
Error suites/state/mmt_state_test.py::test_check_jpeg_attr_state[jpegphoto-thedeadbeef2-ldap.MOD_DELETE-exp_values4-vdcsn]::setup 0.00
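Every request line in the thirteen mmt_state_test errors above pointed at the same cached SubRequest ('topo' for test_check_desc_attr_state[description-Test1usr1-ldap.MOD_ADD-exp_values0-vucsn]): the module-scoped topology fixture failed once, and pytest replays that cached failure, with a traceback identical to the wait_for_replication one shown above, for every remaining test in the module, so the traceback is shown only once here. A minimal, self-contained sketch of that fan-out behaviour, with hypothetical fixture and test names:

# Sketch: how one module-scoped fixture failure fans out into many errors.
import pytest

@pytest.fixture(scope="module")
def topo():
    # Stands in for the real topology fixture; raising during setup is
    # enough to reproduce the cascade.
    raise Exception("Replication did not sync in time!")

def test_first(topo):
    assert True  # never runs; the fixture raised during setup

def test_second(topo):
    assert True  # reported as a setup error with the same cached traceback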
Error suites/vlv/regression_test.py::test_bulk_import_when_the_backend_with_vlv_was_recreated::setup 40.12
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 08a496fd-092f-4022-a416-479037e37909 / got description=None) [message repeated for each of the 20 one-second polls]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 08a496fd-092f-4022-a416-479037e37909 / got description=None)
Error tickets/ticket47573_test.py::test_ticket47573_init::setup 40.85
request = <SubRequest 'topology_m1c1' for <Function test_ticket47573_init>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
    """Create Replication Deployment with one master and one consumer"""

>   topology = create_topology({ReplicaRole.MASTER: 1,
                                ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f48d5640>
from_instance = <lib389.DirSrv object at 0x7fa5f4d54940>
to_instance = <lib389.DirSrv object at 0x7fa5f4863880>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    """Wait for a replication event to occur from instance to instance. This
    shows that some point of synchronisation has occurred.

    :param from_instance: The instance whose state we want to check from.
    :type from_instance: lib389.DirSrv
    :param to_instance: The instance whose state we want to check matches from_instance.
    :type to_instance: lib389.DirSrv
    :param timeout: Fail after timeout seconds.
    :type timeout: int

    """
    # Touch something, then wait for the change to replicate.
    from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
    to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
    from_group = from_groups.get('replication_managers')
    to_group = to_groups.get('replication_managers')

    change = str(uuid.uuid4())

    from_group.replace('description', change)

    for i in range(0, timeout):
        desc = to_group.get_attr_val_utf8('description')
        if change == desc:
            self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
            return True
        self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
        time.sleep(1)
    self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>   raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-consumer1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:175 Joining consumer consumer1 from master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 583ea779-58f5-4b20-b426-0f79ffd811c9 / got description=None) [message repeated for each of the 20 one-second polls]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 583ea779-58f5-4b20-b426-0f79ffd811c9 / got description=None)
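Unlike the supplier-to-supplier failures above, this entry dies in the consumer join path (join_consumer via topologies.py:176), and the three test_ticket47573_* errors below all reuse the same cached topology_m1c1 fixture failure, so their identical tracebacks are not repeated. For reference, the deployment the fixture builds can be requested through lib389's public fixture module; a minimal sketch, assuming a standard lib389 install where the topology objects expose ms/cs dicts as in other 389-ds test suites, with a hypothetical test name:

# Sketch only: the same master+consumer deployment the failing setup builds.
from lib389.topologies import topology_m1c1  # noqa: F401 (used as a fixture)

def test_consumer_is_reachable(topology_m1c1):
    master = topology_m1c1.ms["master1"]
    consumer = topology_m1c1.cs["consumer1"]
    # Both are lib389.DirSrv handles; status() reports whether the
    # instance's ns-slapd process is running.
    assert master.status()
    assert consumer.status()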
Error tickets/ticket47573_test.py::test_ticket47573_one::setup 0.00
request = <SubRequest 'topology_m1c1' for <Function test_ticket47573_init>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f48d5640>
from_instance = <lib389.DirSrv object at 0x7fa5f4d54940>
to_instance = <lib389.DirSrv object at 0x7fa5f4863880>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
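The fixture failure above reduces to a single call, which can be reproduced outside pytest. A minimal sketch (the import paths are exactly the ones visible in the traceback; nothing else is assumed):

from lib389._constants import ReplicaRole
from lib389.topologies import create_topology

# Same topology request the topology_m1c1 fixture makes; this raises
# "Exception: Replication did not sync in time!" whenever the consumer
# never receives the probe write during the join.
topology = create_topology({ReplicaRole.MASTER: 1,
                            ReplicaRole.CONSUMER: 1})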
Error tickets/ticket47573_test.py::test_ticket47573_two::setup 0.00
request = <SubRequest 'topology_m1c1' for <Function test_ticket47573_init>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f48d5640>
from_instance = <lib389.DirSrv object at 0x7fa5f4d54940>
to_instance = <lib389.DirSrv object at 0x7fa5f4863880>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47573_test.py::test_ticket47573_three::setup 0.00
request = <SubRequest 'topology_m1c1' for <Function test_ticket47573_init>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f48d5640>
from_instance = <lib389.DirSrv object at 0x7fa5f4d54940>
to_instance = <lib389.DirSrv object at 0x7fa5f4863880>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47619_test.py::test_ticket47619_init::setup 40.52
request = <SubRequest 'topology_m1c1' for <Function test_ticket47619_init>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4ed13a0>
from_instance = <lib389.DirSrv object at 0x7fa5f49b6370>
to_instance = <lib389.DirSrv object at 0x7fa5f49b62e0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
Instance slapd-consumer1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:175 Joining consumer consumer1 from master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 493193d3-284a-4854-98bf-7795e086a692 / got description=None) [record repeated 20 times, once per second, until the 20 s timeout]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 493193d3-284a-4854-98bf-7795e086a692 / got description=None)
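Every one of these setup errors times out inside the same probe: a write-then-poll loop over the 'replication_managers' group. The sketch below distills it from the traceback; it is an illustration assembled from the calls shown above (and assumes the lib389.idm.group import path), not the library source verbatim:

import time
import uuid

from lib389.idm.group import Groups

def probe_replication(supplier, consumer, suffix, timeout=20):
    # Write a unique marker on the supplier side...
    marker = str(uuid.uuid4())
    src = Groups(supplier, basedn=suffix, rdn=None).get('replication_managers')
    dst = Groups(consumer, basedn=suffix, rdn=None).get('replication_managers')
    src.replace('description', marker)
    # ...then poll the consumer's copy once per second until it shows up.
    for _ in range(timeout):
        if dst.get_attr_val_utf8('description') == marker:
            return True
        time.sleep(1)
    # Every run in this report ends here: description stays None.
    return False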
Error tickets/ticket47619_test.py::test_ticket47619_create_index::setup 0.00
request = <SubRequest 'topology_m1c1' for <Function test_ticket47619_init>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4ed13a0>
from_instance = <lib389.DirSrv object at 0x7fa5f49b6370>
to_instance = <lib389.DirSrv object at 0x7fa5f49b62e0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47619_test.py::test_ticket47619_reindex::setup 0.00
request = <SubRequest 'topology_m1c1' for <Function test_ticket47619_init>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4ed13a0>
from_instance = <lib389.DirSrv object at 0x7fa5f49b6370>
to_instance = <lib389.DirSrv object at 0x7fa5f49b62e0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47619_test.py::test_ticket47619_check_indexed_search::setup 0.00
request = <SubRequest 'topology_m1c1' for <Function test_ticket47619_init>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4ed13a0>
from_instance = <lib389.DirSrv object at 0x7fa5f49b6370>
to_instance = <lib389.DirSrv object at 0x7fa5f49b62e0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
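If the machine is merely slow rather than replication being broken, the same probe can be re-run by hand with a longer window than the fixture's 20 s default. A sketch, assuming m1 and c1 are already-opened DirSrv handles for the master and the consumer (those two names are placeholders, not part of the report):

from lib389._constants import DEFAULT_SUFFIX
from lib389.replica import ReplicationManager

repl = ReplicationManager(DEFAULT_SUFFIX)  # 'dc=example,dc=com', as in the logs
# Same entry point the fixture uses (replica.py:2534), with the timeout
# raised from 20 s to 120 s; raises the same Exception if it still fails.
repl.wait_for_replication(m1, c1, timeout=120)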
Error tickets/ticket47653MMR_test.py::test_ticket47653_init::setup 41.27
request = <SubRequest 'topology_m2' for <Function test_ticket47653_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4ac5ca0>
from_instance = <lib389.DirSrv object at 0x7fa5f49a0a00>
to_instance = <lib389.DirSrv object at 0x7fa5f49919a0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 562b400d-3cf4-4b28-8416-32a8de4c32e1 / got description=None) [record repeated 20 times, once per second, until the 20 s timeout]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 562b400d-3cf4-4b28-8416-32a8de4c32e1 / got description=None)
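For the two-master topologies the join probes each direction in turn (join_master at replica.py:2167 calls test_replication). The logs above show the first direction (39001 -> 39002) failing before the reverse is ever exercised. A sketch of checking both directions separately, again with m1/m2 as placeholder DirSrv handles:

from lib389._constants import DEFAULT_SUFFIX
from lib389.replica import ReplicationManager

repl = ReplicationManager(DEFAULT_SUFFIX)
repl.test_replication(m1, m2)  # master1 -> master2 (the direction failing here)
repl.test_replication(m2, m1)  # master2 -> master1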
Error tickets/ticket47653MMR_test.py::test_ticket47653_add::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_ticket47653_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4ac5ca0>
from_instance = <lib389.DirSrv object at 0x7fa5f49a0a00>
to_instance = <lib389.DirSrv object at 0x7fa5f49919a0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47653MMR_test.py::test_ticket47653_modify::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_ticket47653_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4ac5ca0>
from_instance = <lib389.DirSrv object at 0x7fa5f49a0a00>
to_instance = <lib389.DirSrv object at 0x7fa5f49919a0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47676_test.py::test_ticket47676_init::setup 41.34
request = <SubRequest 'topology_m2' for <Function test_ticket47676_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4af3b20>
from_instance = <lib389.DirSrv object at 0x7fa5f48190a0>
to_instance = <lib389.DirSrv object at 0x7fa5f4ad13a0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c78e451e-04ca-4184-86e4-b28d1e7080ec / got description=None) [record repeated 20 times, once per second, until the 20 s timeout]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c78e451e-04ca-4184-86e4-b28d1e7080ec / got description=None)
Error tickets/ticket47676_test.py::test_ticket47676_skip_oc_at::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_ticket47676_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4af3b20>
from_instance = <lib389.DirSrv object at 0x7fa5f48190a0>
to_instance = <lib389.DirSrv object at 0x7fa5f4ad13a0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47676_test.py::test_ticket47676_reject_action::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_ticket47676_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4af3b20>
from_instance = <lib389.DirSrv object at 0x7fa5f48190a0>
to_instance = <lib389.DirSrv object at 0x7fa5f4ad13a0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
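The "got description=None" in every retry means the probe UUID never arrives on the receiving replica. That can be confirmed directly with the same accessor the probe uses; a sketch, with m2 as a placeholder for the receiving DirSrv instance:

from lib389._constants import DEFAULT_SUFFIX
from lib389.idm.group import Groups

to_group = Groups(m2, basedn=DEFAULT_SUFFIX, rdn=None).get('replication_managers')
# Returns None for as long as the supplier's write has not been replicated.
print(to_group.get_attr_val_utf8('description'))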
Error tickets/ticket47721_test.py::test_ticket47721_init::setup 42.82
request = <SubRequest 'topology_m2' for <Function test_ticket47721_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f52265e0>
from_instance = <lib389.DirSrv object at 0x7fa5f46c8670>
to_instance = <lib389.DirSrv object at 0x7fa5f46cb310>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 508ae4d2-5fba-44eb-917f-b2e48d302a81 / got description=None) [message repeated 20 times, once per one-second poll]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 508ae4d2-5fba-44eb-917f-b2e48d302a81 / got description=None)
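
All of the setup errors in this section fail inside the same lib389 convergence check: wait_for_replication writes a fresh UUID into the description attribute of the supplier-side replication_managers group, then polls the peer once per second for the same value. The following is a minimal sketch of that canary-write pattern in isolation; write_fn and read_fn are hypothetical stand-ins for the from_group.replace(...) and to_group.get_attr_val_utf8(...) calls quoted in the tracebacks.

import time
import uuid

def wait_for_convergence(write_fn, read_fn, timeout=20):
    # Write a unique canary value on the supplier side.
    canary = str(uuid.uuid4())
    write_fn(canary)
    for _ in range(timeout):
        if read_fn() == canary:
            return True          # the consumer observed the change
        time.sleep(1)            # poll once per second, as replica.py does
    raise Exception("Replication did not sync in time!")
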
Error tickets/ticket47721_test.py::test_ticket47721_0::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_ticket47721_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f52265e0>
from_instance = <lib389.DirSrv object at 0x7fa5f46c8670>
to_instance = <lib389.DirSrv object at 0x7fa5f46cb310>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47721_test.py::test_ticket47721_1::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_ticket47721_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f52265e0>
from_instance = <lib389.DirSrv object at 0x7fa5f46c8670>
to_instance = <lib389.DirSrv object at 0x7fa5f46cb310>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47721_test.py::test_ticket47721_2::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_ticket47721_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f52265e0>
from_instance = <lib389.DirSrv object at 0x7fa5f46c8670>
to_instance = <lib389.DirSrv object at 0x7fa5f46cb310>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47721_test.py::test_ticket47721_3::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_ticket47721_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f52265e0>
from_instance = <lib389.DirSrv object at 0x7fa5f46c8670>
to_instance = <lib389.DirSrv object at 0x7fa5f46cb310>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47721_test.py::test_ticket47721_4::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_ticket47721_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f52265e0>
from_instance = <lib389.DirSrv object at 0x7fa5f46c8670>
to_instance = <lib389.DirSrv object at 0x7fa5f46cb310>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47787_test.py::test_ticket47787_init::setup 43.39
request = <SubRequest 'topology_m2' for <Function test_ticket47787_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f487ec70>
from_instance = <lib389.DirSrv object at 0x7fa5f48385b0>
to_instance = <lib389.DirSrv object at 0x7fa5f487a3a0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 07cf1f1e-ac18-4923-b48d-79b9e5b935ec / got description=None) [message repeated 20 times, once per one-second poll]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 07cf1f1e-ac18-4923-b48d-79b9e5b935ec / got description=None)
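
In every failure above the peer never reports the canary before the 20-second default window closes, so the proximate error is the timeout rather than a crash. A minimal sketch of re-running the same check with a wider window, assuming two already-connected DirSrv handles named m1 and m2 (hypothetical names; the wait_for_replication signature is the one quoted in the tracebacks):

from lib389._constants import DEFAULT_SUFFIX
from lib389.replica import ReplicationManager

# m1 and m2 are assumed, pre-existing DirSrv handles for the two masters.
repl = ReplicationManager(DEFAULT_SUFFIX)
repl.wait_for_replication(m1, m2, timeout=60)  # 60 s window instead of the default 20
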
Error tickets/ticket47787_test.py::test_ticket47787_2::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_ticket47787_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f487ec70>
from_instance = <lib389.DirSrv object at 0x7fa5f48385b0>
to_instance = <lib389.DirSrv object at 0x7fa5f487a3a0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47869MMR_test.py::test_ticket47869_init::setup 41.11
request = <SubRequest 'topology_m2' for <Function test_ticket47869_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4962070>
from_instance = <lib389.DirSrv object at 0x7fa5f4691f70>
to_instance = <lib389.DirSrv object at 0x7fa5f4827d30>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 2cfb9898-07c5-4653-a317-8299845150cb / got description=None) [message repeated 20 times, once per one-second poll]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 2cfb9898-07c5-4653-a317-8299845150cb / got description=None)
Error tickets/ticket47869MMR_test.py::test_ticket47869_check::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_ticket47869_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f4962070>
from_instance = <lib389.DirSrv object at 0x7fa5f4691f70>
to_instance = <lib389.DirSrv object at 0x7fa5f4827d30>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47871_test.py::test_ticket47871_init::setup 40.15
request = <SubRequest 'topology_m1c1' for <Function test_ticket47871_init>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f45f6190>
from_instance = <lib389.DirSrv object at 0x7fa5f459dd60>
to_instance = <lib389.DirSrv object at 0x7fa5f459ddc0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-consumer1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:175 Joining consumer consumer1 from master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 82343823-7cdd-47ed-bc31-6fd84c63e117 / got description=None) [message repeated 20 times, once per one-second poll]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 82343823-7cdd-47ed-bc31-6fd84c63e117 / got description=None)
Error tickets/ticket47871_test.py::test_ticket47871_1::setup 0.00
request = <SubRequest 'topology_m1c1' for <Function test_ticket47871_init>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f45f6190>
from_instance = <lib389.DirSrv object at 0x7fa5f459dd60>
to_instance = <lib389.DirSrv object at 0x7fa5f459ddc0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47871_test.py::test_ticket47871_2::setup 0.00
request = <SubRequest 'topology_m1c1' for <Function test_ticket47871_init>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f45f6190>
from_instance = <lib389.DirSrv object at 0x7fa5f459dd60>
to_instance = <lib389.DirSrv object at 0x7fa5f459ddc0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
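
The ticket47871 errors above come from the one-master/one-consumer fixture rather than the two-master one. A minimal sketch of what topology_m1c1 builds, using the create_topology call quoted in the traceback; the .ms and .cs dictionary accessors are assumed here as the usual lib389 topology attributes and are shown illustratively:

from lib389._constants import ReplicaRole
from lib389.topologies import create_topology

# Build the same supplier/consumer pair the fixture attempts.
topology = create_topology({ReplicaRole.MASTER: 1, ReplicaRole.CONSUMER: 1})
master = topology.ms["master1"]       # supplier-side DirSrv instance
consumer = topology.cs["consumer1"]   # consumer-side DirSrv instance
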
Error tickets/ticket47988_test.py::test_ticket47988_init::setup 40.08
request = <SubRequest 'topology_m2' for <Function test_ticket47988_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f41299a0>
from_instance = <lib389.DirSrv object at 0x7fa5f41e66a0>
to_instance = <lib389.DirSrv object at 0x7fa5f40fcac0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows that some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 9ff147e9-90db-4872-92b6-335b1a0411d7 / got description=None) [message repeated 20 times]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 9ff147e9-90db-4872-92b6-335b1a0411d7 / got description=None)
Error tickets/ticket47988_test.py::test_ticket47988_1::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_ticket47988_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f41299a0>
from_instance = <lib389.DirSrv object at 0x7fa5f41e66a0>
to_instance = <lib389.DirSrv object at 0x7fa5f40fcac0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows that some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47988_test.py::test_ticket47988_2::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_ticket47988_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f41299a0>
from_instance = <lib389.DirSrv object at 0x7fa5f41e66a0>
to_instance = <lib389.DirSrv object at 0x7fa5f40fcac0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows that some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47988_test.py::test_ticket47988_3::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_ticket47988_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f41299a0>
from_instance = <lib389.DirSrv object at 0x7fa5f41e66a0>
to_instance = <lib389.DirSrv object at 0x7fa5f40fcac0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows that some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47988_test.py::test_ticket47988_4::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_ticket47988_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f41299a0>
from_instance = <lib389.DirSrv object at 0x7fa5f41e66a0>
to_instance = <lib389.DirSrv object at 0x7fa5f40fcac0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows that some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47988_test.py::test_ticket47988_5::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_ticket47988_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f41299a0>
from_instance = <lib389.DirSrv object at 0x7fa5f41e66a0>
to_instance = <lib389.DirSrv object at 0x7fa5f40fcac0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows that some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket47988_test.py::test_ticket47988_6::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_ticket47988_init>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f41299a0>
from_instance = <lib389.DirSrv object at 0x7fa5f41e66a0>
to_instance = <lib389.DirSrv object at 0x7fa5f40fcac0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows that some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
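Note the 0.00-second durations on test_ticket47988_1 through test_ticket47988_6 above: topology_m2 is a module-scoped fixture, so once it failed during the setup of test_ticket47988_init, pytest reported the cached fixture error for every remaining test in the module without re-running setup. A self-contained illustration of that caching behaviour, with hypothetical test names unrelated to this suite:

# Hypothetical, minimal reproduction of the 0.00s follow-on errors: a
# module-scoped fixture body runs at most once, so when it raises, every
# later test in the module errors immediately with the cached failure.
import pytest

@pytest.fixture(scope="module")
def broken_topology():
    raise Exception("Replication did not sync in time!")

def test_first(broken_topology):
    pass  # errors after the full (slow) setup attempt

def test_second(broken_topology):
    pass  # errors in ~0.00s: the fixture body is not run again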
Error tickets/ticket48266_test.py::test_ticket48266_fractional::setup 39.29
request = <SubRequest 'topology_m2' for <Function test_ticket48266_fractional>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5efc7b550>
from_instance = <lib389.DirSrv object at 0x7fa5efc7c310>
to_instance = <lib389.DirSrv object at 0x7fa5efc62eb0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows that some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 9df53bde-2ce0-4b2a-9821-66ae2064ed1a / got description=None) [message repeated 20 times]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 9df53bde-2ce0-4b2a-9821-66ae2064ed1a / got description=None)
Error tickets/ticket48266_test.py::test_ticket48266_check_repl_desc::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_ticket48266_fractional>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5efc7b550>
from_instance = <lib389.DirSrv object at 0x7fa5efc7c310>
to_instance = <lib389.DirSrv object at 0x7fa5efc62eb0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows that some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
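The probe that keeps timing out is ordinary LDAP traffic: one MOD_REPLACE on the supplier, then repeated base-scope reads on the consumer. The same check can be sketched with python-ldap for diagnosis outside lib389; the URIs, bind credentials, and group DN below are placeholders, and the DN layout of the replication_managers group is an assumption rather than something shown in this report:

# Sketch of the same convergence probe using python-ldap directly.
# All connection details are placeholders, not values from this report.
import time
import uuid
import ldap

GROUP_DN = "cn=replication_managers,dc=example,dc=com"  # assumed DN layout

def wait_for_sync(from_uri, to_uri, bind_dn, password, timeout=20):
    marker = str(uuid.uuid4()).encode("utf-8")

    src = ldap.initialize(from_uri)
    src.simple_bind_s(bind_dn, password)
    # Touch the supplier: replace description with a unique marker.
    src.modify_s(GROUP_DN, [(ldap.MOD_REPLACE, "description", [marker])])

    dst = ldap.initialize(to_uri)
    dst.simple_bind_s(bind_dn, password)
    for _ in range(timeout):
        _, attrs = dst.search_s(GROUP_DN, ldap.SCOPE_BASE, attrlist=["description"])[0]
        if attrs.get("description", [None])[0] == marker:
            return True  # the change replicated within the timeout
        time.sleep(1)
    raise Exception("Replication did not sync in time!")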
Error tickets/ticket48266_test.py::test_ticket48266_count_csn_evaluation::setup 0.00
request = <SubRequest 'topology_m2' for <Function test_ticket48266_fractional>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5efc7b550>
from_instance = <lib389.DirSrv object at 0x7fa5efc7c310>
to_instance = <lib389.DirSrv object at 0x7fa5efc62eb0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows that some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket48325_test.py::test_ticket48325::setup 49.00
request = <SubRequest 'topology_m1h1c1' for <Function test_ticket48325>>

@pytest.fixture(scope="module")
def topology_m1h1c1(request):
"""Create Replication Deployment with one master, one consumer and one hub"""

topo_roles = {ReplicaRole.MASTER: 1, ReplicaRole.HUB: 1, ReplicaRole.CONSUMER: 1}
topology = _create_instances(topo_roles, DEFAULT_SUFFIX)
master = topology.ms["master1"]
hub = topology.hs["hub1"]
consumer = topology.cs["consumer1"]

# Start with the master, and create it "first".
log.info("Creating replication topology.")
# Now get the first master ready.
repl = ReplicationManager(DEFAULT_SUFFIX)
repl.create_first_master(master)
# Finish the topology creation
> repl.join_hub(master, hub)

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:520:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2226: in join_hub
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5eff0d310>
from_instance = <lib389.DirSrv object at 0x7fa5eff76790>
to_instance = <lib389.DirSrv object at 0x7fa5eff769d0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows that some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-hub1 removed. Instance slapd-consumer1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39101, 'ldap-secureport': 63801, 'server-id': 'hub1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:515 Creating replication topology.
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 68d6a44d-ee7f-4240-9c5f-3a0555fa30e7 / got description=None) [message repeated 20 times]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 68d6a44d-ee7f-4240-9c5f-3a0555fa30e7 / got description=None)
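Unlike the create_topology() fixtures above, topology_m1h1c1 builds its cascade by hand, and it is the master-to-hub join that times out before the consumer is ever attached. A sketch of the full sequence the fixture is working toward, using the calls shown in this report; pointing join_consumer() at the hub in the final step is an assumption about the intended cascade, not code shown above:

# Sketch of the m1h1c1 cascade. Everything except the join_consumer()
# target appears verbatim in the fixture code in the traceback above.
from lib389._constants import DEFAULT_SUFFIX, ReplicaRole
from lib389.replica import ReplicationManager
from lib389.topologies import _create_instances

topo_roles = {ReplicaRole.MASTER: 1, ReplicaRole.HUB: 1, ReplicaRole.CONSUMER: 1}
topology = _create_instances(topo_roles, DEFAULT_SUFFIX)
master = topology.ms["master1"]
hub = topology.hs["hub1"]
consumer = topology.cs["consumer1"]

repl = ReplicationManager(DEFAULT_SUFFIX)
repl.create_first_master(master)
repl.join_hub(master, hub)         # the step that raised above
repl.join_consumer(hub, consumer)  # assumed next step: consumer fed by the hub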
Error tickets/ticket48342_test.py::test_ticket4026::setup 47.77
request = <SubRequest 'topology_m3' for <Function test_ticket4026>>

@pytest.fixture(scope="module")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5eff28a30>
from_instance = <lib389.DirSrv object at 0x7fa5f4177100>
to_instance = <lib389.DirSrv object at 0x7fa5f40b38b0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows that some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed. Instance slapd-master3 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect be19d619-0d23-48c3-af44-a9efc8a199bb / got description=None) [message repeated 20 times]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect be19d619-0d23-48c3-af44-a9efc8a199bb / got description=None)
Error tickets/ticket48362_test.py::test_ticket48362::setup 41.22
request = <SubRequest 'topology_m2' for <Function test_ticket48362>>

@pytest.fixture(scope="module")
def topology_m2(request):
    """Create Replication Deployment with two masters"""

>   topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5eff8e250>
from_instance = <lib389.DirSrv object at 0x7fa5efa6feb0>
to_instance = <lib389.DirSrv object at 0x7fa5eff8ea30>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    """Wait for a replication event to occur from instance to instance. This
    shows that some point of synchronisation has occurred.

    :param from_instance: The instance whose state we want to check from.
    :type from_instance: lib389.DirSrv
    :param to_instance: The instance whose state we want to check matches from.
    :type to_instance: lib389.DirSrv
    :param timeout: Fail after timeout seconds.
    :type timeout: int

    """
    # Touch something then wait_for_replication.
    from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
    to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
    from_group = from_groups.get('replication_managers')
    to_group = to_groups.get('replication_managers')

    change = str(uuid.uuid4())

    from_group.replace('description', change)

    for i in range(0, timeout):
        desc = to_group.get_attr_val_utf8('description')
        if change == desc:
            self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
            return True
        self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
        time.sleep(1)
    self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b72f6477-60f7-4de0-821a-f71d62ef8dcf / got description=None) [repeated for all 20 retry attempts]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b72f6477-60f7-4de0-821a-f71d62ef8dcf / got description=None)
Error tickets/ticket48784_test.py::test_ticket48784::setup 41.44
request = <SubRequest 'topology_m2' for <Function test_ticket48784>>

@pytest.fixture(scope="module")
def topology_m2(request):
    """Create Replication Deployment with two masters"""

>   topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5efa73520>
from_instance = <lib389.DirSrv object at 0x7fa5eff29f70>
to_instance = <lib389.DirSrv object at 0x7fa5efe9c9d0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    """Wait for a replication event to occur from instance to instance. This
    shows that some point of synchronisation has occurred.

    :param from_instance: The instance whose state we want to check from.
    :type from_instance: lib389.DirSrv
    :param to_instance: The instance whose state we want to check matches from.
    :type to_instance: lib389.DirSrv
    :param timeout: Fail after timeout seconds.
    :type timeout: int

    """
    # Touch something then wait_for_replication.
    from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
    to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
    from_group = from_groups.get('replication_managers')
    to_group = to_groups.get('replication_managers')

    change = str(uuid.uuid4())

    from_group.replace('description', change)

    for i in range(0, timeout):
        desc = to_group.get_attr_val_utf8('description')
        if change == desc:
            self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
            return True
        self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
        time.sleep(1)
    self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 3ea0a1e9-6bbc-457b-8ab2-6be8a1113fcd / got description=None) [repeated for all 20 retry attempts]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 3ea0a1e9-6bbc-457b-8ab2-6be8a1113fcd / got description=None)
Error tickets/ticket48799_test.py::test_ticket48799::setup 41.33
request = <SubRequest 'topology_m1c1' for <Function test_ticket48799>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
    """Create Replication Deployment with one master and one consumer"""

>   topology = create_topology({ReplicaRole.MASTER: 1,
                                ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5efc96b20>
from_instance = <lib389.DirSrv object at 0x7fa5efe14a90>
to_instance = <lib389.DirSrv object at 0x7fa5efe14dc0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    """Wait for a replication event to occur from instance to instance. This
    shows that some point of synchronisation has occurred.

    :param from_instance: The instance whose state we want to check from.
    :type from_instance: lib389.DirSrv
    :param to_instance: The instance whose state we want to check matches from.
    :type to_instance: lib389.DirSrv
    :param timeout: Fail after timeout seconds.
    :type timeout: int

    """
    # Touch something then wait_for_replication.
    from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
    to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
    from_group = from_groups.get('replication_managers')
    to_group = to_groups.get('replication_managers')

    change = str(uuid.uuid4())

    from_group.replace('description', change)

    for i in range(0, timeout):
        desc = to_group.get_attr_val_utf8('description')
        if change == desc:
            self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
            return True
        self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
        time.sleep(1)
    self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-consumer1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:175 Joining consumer consumer1 from master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 6f8b72e2-9ce4-44dc-b82b-0d343ce65d26 / got description=None) [repeated for all 20 retry attempts]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 6f8b72e2-9ce4-44dc-b82b-0d343ce65d26 / got description=None)
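The same probe, reduced to its essential pattern: write a sentinel value through one connection, then poll a second connection until the sentinel becomes visible or the attempt budget runs out. A self-contained pure-Python illustration of that pattern (a hypothetical helper, not part of lib389):

    import time
    import uuid

    def wait_for_sync(write_fn, read_fn, attempts=20, interval=1.0):
        """Return True once read_fn() observes the sentinel written by write_fn()."""
        sentinel = str(uuid.uuid4())
        write_fn(sentinel)                # e.g. replace 'description' on the supplier
        for _ in range(attempts):
            if read_fn() == sentinel:     # e.g. read 'description' on the consumer
                return True
            time.sleep(interval)
        raise Exception("Replication did not sync in time!")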
Error tickets/ticket48916_test.py::test_ticket48916::setup 41.15
request = <SubRequest 'topology_m2' for <Function test_ticket48916>>

@pytest.fixture(scope="module")
def topology_m2(request):
    """Create Replication Deployment with two masters"""

>   topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5efc03b50>
from_instance = <lib389.DirSrv object at 0x7fa5efdcab80>
to_instance = <lib389.DirSrv object at 0x7fa5efc24070>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    """Wait for a replication event to occur from instance to instance. This
    shows that some point of synchronisation has occurred.

    :param from_instance: The instance whose state we want to check from.
    :type from_instance: lib389.DirSrv
    :param to_instance: The instance whose state we want to check matches from.
    :type to_instance: lib389.DirSrv
    :param timeout: Fail after timeout seconds.
    :type timeout: int

    """
    # Touch something then wait_for_replication.
    from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
    to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
    from_group = from_groups.get('replication_managers')
    to_group = to_groups.get('replication_managers')

    change = str(uuid.uuid4())

    from_group.replace('description', change)

    for i in range(0, timeout):
        desc = to_group.get_attr_val_utf8('description')
        if change == desc:
            self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
            return True
        self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
        time.sleep(1)
    self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect fa9ef863-82de-4fb4-9c38-d4177ccfdec0 / got description=None) [repeated for all 20 retry attempts]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect fa9ef863-82de-4fb4-9c38-d4177ccfdec0 / got description=None)
Error tickets/ticket48944_test.py::test_ticket48944::setup 54.65
request = <SubRequest 'topo' for <Function test_ticket48944>>

@pytest.fixture(scope="module")
def topology_m2c2(request):
    """Create Replication Deployment with two masters and two consumers"""

>   topology = create_topology({ReplicaRole.MASTER: 2,
                                ReplicaRole.CONSUMER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5efff7f40>
from_instance = <lib389.DirSrv object at 0x7fa5efb6b910>
to_instance = <lib389.DirSrv object at 0x7fa5efc13670>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    """Wait for a replication event to occur from instance to instance. This
    shows that some point of synchronisation has occurred.

    :param from_instance: The instance whose state we want to check from.
    :type from_instance: lib389.DirSrv
    :param to_instance: The instance whose state we want to check matches from.
    :type to_instance: lib389.DirSrv
    :param timeout: Fail after timeout seconds.
    :type timeout: int

    """
    # Touch something then wait_for_replication.
    from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
    to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
    from_group = from_groups.get('replication_managers')
    to_group = to_groups.get('replication_managers')

    change = str(uuid.uuid4())

    from_group.replace('description', change)

    for i in range(0, timeout):
        desc = to_group.get_attr_val_utf8('description')
        if change == desc:
            self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
            return True
        self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
        time.sleep(1)
    self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed. Instance slapd-consumer1 removed. Instance slapd-consumer2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39202, 'ldap-secureport': 63902, 'server-id': 'consumer2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 688af649-df20-4fdd-98db-15c7a0e3754d / got description=None) [repeated for all 20 retry attempts]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 688af649-df20-4fdd-98db-15c7a0e3754d / got description=None)
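For reference, the topology requests that fail in this section all funnel through the same create_topology() call shown in the tracebacks; only the role counts differ. A sketch of those requests side by side (imports assumed from lib389, as in the fixtures):

    from lib389.topologies import create_topology
    from lib389._constants import ReplicaRole

    create_topology({ReplicaRole.MASTER: 2})                           # topology_m2
    create_topology({ReplicaRole.MASTER: 3})                           # topology_m3
    create_topology({ReplicaRole.MASTER: 1, ReplicaRole.CONSUMER: 1})  # topology_m1c1
    create_topology({ReplicaRole.MASTER: 2, ReplicaRole.CONSUMER: 2})  # topology_m2c2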
Error tickets/ticket49008_test.py::test_ticket49008::setup 48.76
request = <SubRequest 'T' for <Function test_ticket49008>>

@pytest.fixture(scope="module")
def topology_m3(request):
    """Create Replication Deployment with three masters"""

>   topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5ef8d2580>
from_instance = <lib389.DirSrv object at 0x7fa5efee48e0>
to_instance = <lib389.DirSrv object at 0x7fa5ef8ece20>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
    """Wait for a replication event to occur from instance to instance. This
    shows that some point of synchronisation has occurred.

    :param from_instance: The instance whose state we want to check from.
    :type from_instance: lib389.DirSrv
    :param to_instance: The instance whose state we want to check matches from.
    :type to_instance: lib389.DirSrv
    :param timeout: Fail after timeout seconds.
    :type timeout: int

    """
    # Touch something then wait_for_replication.
    from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
    to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
    from_group = from_groups.get('replication_managers')
    to_group = to_groups.get('replication_managers')

    change = str(uuid.uuid4())

    from_group.replace('description', change)

    for i in range(0, timeout):
        desc = to_group.get_attr_val_utf8('description')
        if change == desc:
            self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
            return True
        self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
        time.sleep(1)
    self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>   raise Exception("Replication did not sync in time!")
E   Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed. Instance slapd-master3 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 1829cf5a-c338-4098-9316-7f9e49acefa4 / got description=None) [repeated for all 20 retry attempts]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 1829cf5a-c338-4098-9316-7f9e49acefa4 / got description=None)
Error tickets/ticket49020_test.py::test_ticket49020::setup 47.18
request = <SubRequest 'T' for <Function test_ticket49020>>

@pytest.fixture(scope="module")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5efed2610>
from_instance = <lib389.DirSrv object at 0x7fa5efee46d0>
to_instance = <lib389.DirSrv object at 0x7fa5efec3d30>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows that some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from_instance.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
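The probe that times out here is fully visible in the listing above: the fixture writes a fresh UUID into the description of the cn=replication_managers group on the supplier and polls the same entry on the peer for 20 seconds. When debugging by hand it can help to drive that probe directly with a longer window. The sketch below is a minimal re-implementation under stated assumptions: m1 and m2 are hypothetical, already-connected lib389 DirSrv objects for this suffix, and the Groups import path is assumed to be lib389.idm.group.

import time
import uuid

from lib389.idm.group import Groups  # import path assumed

def probe_replication(m1, m2, suffix='dc=example,dc=com', timeout=60):
    """Manually run the same description-based probe as wait_for_replication."""
    marker = str(uuid.uuid4())
    # Touch the supplier-side entry, exactly as the fixture does.
    Groups(m1, basedn=suffix, rdn=None).get('replication_managers').replace('description', marker)
    to_group = Groups(m2, basedn=suffix, rdn=None).get('replication_managers')
    for _ in range(timeout):
        if to_group.get_attr_val_utf8('description') == marker:
            return True  # the change arrived on the peer
        time.sleep(1)
    return False  # still not replicated after `timeout` seconds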
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed. Instance slapd-master3 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 8c9dda54-e88e-4e62-aae6-eaa91eac7322 / got description=None) [message repeated 19 more times, once per second]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 8c9dda54-e88e-4e62-aae6-eaa91eac7322 / got description=None)
Error tickets/ticket49073_test.py::test_ticket49073::setup 40.85
request = <SubRequest 'topology_m2' for <Function test_ticket49073>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5efbbffd0>
from_instance = <lib389.DirSrv object at 0x7fa5efed8250>
to_instance = <lib389.DirSrv object at 0x7fa5efbb37c0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows that some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from_instance.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect ed8d5042-ede8-4f10-9f47-3e28e2493f19 / got description=None) [message repeated 19 more times, once per second]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect ed8d5042-ede8-4f10-9f47-3e28e2493f19 / got description=None)
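All of these setup errors hit the same 20-second ceiling that is hard-coded into the join path shown above. When the machine is merely slow rather than broken, re-running only the replication check with a larger budget can distinguish the two cases. A short sketch, assuming m1 and m2 are the two DirSrv instances and assuming the suffix-only ReplicationManager constructor used throughout the 389 test suite:

from lib389.replica import ReplicationManager

# Re-run only the probe with a more generous deadline than the fixture's 20 s.
# m1 and m2 are assumed, already-installed DirSrv instances for this suffix.
repl = ReplicationManager('dc=example,dc=com')
repl.wait_for_replication(m1, m2, timeout=120)  # raises on timeout, as above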
Error tickets/ticket49180_test.py::test_ticket49180::setup 53.89
request = <SubRequest 'topology_m4' for <Function test_ticket49180>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5ef7eb190>
from_instance = <lib389.DirSrv object at 0x7fa5ef9c9040>
to_instance = <lib389.DirSrv object at 0x7fa5ef9e0100>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows that some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from_instance.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed. Instance slapd-master3 removed. Instance slapd-master4 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39004, 'ldap-secureport': 63704, 'server-id': 'master4', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e18fdb47-f1cb-4b3f-8869-883af91cb56f / got description=None) [message repeated 19 more times, once per second]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e18fdb47-f1cb-4b3f-8869-883af91cb56f / got description=None)
Error tickets/ticket49287_test.py::test_ticket49287::setup 40.54
request = <SubRequest 'topology_m2' for <Function test_ticket49287>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5efd26d60>
from_instance = <lib389.DirSrv object at 0x7fa5efd27a30>
to_instance = <lib389.DirSrv object at 0x7fa5efd3a940>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows that some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from_instance.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect aee0b098-2cd5-4a8d-a68e-57367cde848e / got description=None) [message repeated 19 more times, once per second]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect aee0b098-2cd5-4a8d-a68e-57367cde848e / got description=None)
Error tickets/ticket49412_test.py::test_ticket49412::setup 42.50
request = <SubRequest 'topo' for <Function test_ticket49412>>

@pytest.fixture(scope="module")
def topology_m1c1(request):
"""Create Replication Deployment with one master and one consumer"""

> topology = create_topology({ReplicaRole.MASTER: 1,
ReplicaRole.CONSUMER: 1})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:176: in create_topology
repl.join_consumer(first_master, c)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2282: in join_consumer
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5efaec8b0>
from_instance = <lib389.DirSrv object at 0x7fa5efaf1fd0>
to_instance = <lib389.DirSrv object at 0x7fa5efaf1220>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows that some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from_instance.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-consumer1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:175 Joining consumer consumer1 from master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect fc030744-e0ec-4312-802a-5c2ad46517bb / got description=None) [message repeated 19 more times, once per second]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect fc030744-e0ec-4312-802a-5c2ad46517bb / got description=None)
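The master-plus-consumer topology this fixture attempts can also be rebuilt outside pytest when bisecting an environment problem. A sketch under stated assumptions (the import locations for create_topology and ReplicaRole, and the ms/cs accessors commonly used in the 389 test suite):

from lib389.topologies import create_topology  # import paths assumed
from lib389._constants import ReplicaRole

# Build the same 1 master + 1 consumer deployment this fixture attempts.
topo = create_topology({ReplicaRole.MASTER: 1, ReplicaRole.CONSUMER: 1})
master = topo.ms["master1"]
consumer = topo.cs["consumer1"]
print(master.ldapuri, consumer.ldapuri)

# Tear the instances down again once the investigation is done.
master.delete()
consumer.delete()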
Error tickets/ticket49460_test.py::test_ticket_49460::setup 45.96
request = <SubRequest 'topo' for <Function test_ticket_49460>>

@pytest.fixture(scope="module")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5efed67f0>
from_instance = <lib389.DirSrv object at 0x7fa5efad5640>
to_instance = <lib389.DirSrv object at 0x7fa5ef93cb80>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows that some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from_instance.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed. Instance slapd-master2 removed. Instance slapd-master3 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d5b9652f-6e91-45c8-8138-d58515efc8c9 / got description=None) [message repeated 19 more times, once per second]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d5b9652f-6e91-45c8-8138-d58515efc8c9 / got description=None)
Error tickets/ticket49463_test.py::test_ticket_49463::setup 53.64
request = <SubRequest 'topo' for <Function test_ticket_49463>>

@pytest.fixture(scope="module")
def topology_m4(request):
"""Create Replication Deployment with four masters"""

> topology = create_topology({ReplicaRole.MASTER: 4})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:471:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5ef9522e0>
from_instance = <lib389.DirSrv object at 0x7fa5efaec6d0>
to_instance = <lib389.DirSrv object at 0x7fa5efa9f400>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
Instance slapd-master2 removed.
Instance slapd-master3 removed.
Instance slapd-master4 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39004, 'ldap-secureport': 63704, 'server-id': 'master4', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 is was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d6c03414-50dc-4aca-a6fc-dcc41d8d9a78 / got description=None)
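Note that join_master() runs this probe with the default timeout of 20 seconds, so a slow or overloaded CI host can fail topology setup even when the agreement is merely lagging. As a hypothetical debugging step (assuming the two DirSrv instances from the fixture are still running, and using the dc=example,dc=com suffix shown in the logs), the same probe can be re-run by hand with a longer window:

from lib389._constants import DEFAULT_SUFFIX
from lib389.replica import ReplicationManager

def recheck_sync(m1, m2, timeout=120):
    # Same write-then-poll probe as the fixture, but with a 120s window
    # instead of 20s, to distinguish a lagging agreement from a broken one.
    repl = ReplicationManager(DEFAULT_SUFFIX)
    return repl.wait_for_replication(m1, m2, timeout=timeout)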
Error tickets/ticket49658_test.py::test_ticket49658_init::setup 44.26
request = <SubRequest 'topo' for <Function test_ticket49658_init>>

@pytest.fixture(scope="module")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5ef5f3cd0>
from_instance = <lib389.DirSrv object at 0x7fa5efbcf880>
to_instance = <lib389.DirSrv object at 0x7fa5efbd2ee0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master2 removed.
Instance slapd-master3 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
INFO  lib389.topologies:topologies.py:162 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is was created
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 is was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 48efadd3-7f09-412d-9419-9054cf6d8061 / got description=None)
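The failing fixture body is small enough to reproduce outside pytest. A hypothetical standalone run (assuming lib389 is installed and the 3900x/6370x ports are free) fails at the first supplier-to-consumer pair that does not sync:

from lib389._constants import ReplicaRole
from lib389.topologies import create_topology

# create_topology() joins each additional master to the first via join_master(),
# which raises the "Replication did not sync in time!" exception seen above.
topology = create_topology({ReplicaRole.MASTER: 3})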
Error tickets/ticket49658_test.py::test_ticket49658_0::setup 0.00
request = <SubRequest 'topo' for <Function test_ticket49658_init>>

@pytest.fixture(scope="module")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5ef5f3cd0>
from_instance = <lib389.DirSrv object at 0x7fa5efbcf880>
to_instance = <lib389.DirSrv object at 0x7fa5efbd2ee0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
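The test_ticket49658_0 entry above and the test_ticket49658_* entries that follow all report a duration of 0.00 and still name test_ticket49658_init in their SubRequest: the topo fixture is module-scoped, so pytest executes it once, caches the exception, and replays it for every later test in the module. A minimal sketch of that behaviour, with hypothetical fixture and test names:

import pytest

@pytest.fixture(scope="module")
def topo():
    # Fails once; pytest caches this exception for the whole module.
    raise Exception("Replication did not sync in time!")

def test_init(topo):   # ERROR at setup; the fixture actually runs here
    pass

def test_0(topo):      # ERROR at setup in 0.00s; the cached failure is replayed
    pass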
Error tickets/ticket49658_test.py::test_ticket49658_1::setup 0.00
request = <SubRequest 'topo' for <Function test_ticket49658_init>>

@pytest.fixture(scope="module")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5ef5f3cd0>
from_instance = <lib389.DirSrv object at 0x7fa5efbcf880>
to_instance = <lib389.DirSrv object at 0x7fa5efbd2ee0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket49658_test.py::test_ticket49658_2::setup 0.00
request = <SubRequest 'topo' for <Function test_ticket49658_init>>

@pytest.fixture(scope="module")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5ef5f3cd0>
from_instance = <lib389.DirSrv object at 0x7fa5efbcf880>
to_instance = <lib389.DirSrv object at 0x7fa5efbd2ee0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket49658_test.py::test_ticket49658_3::setup 0.00
request = <SubRequest 'topo' for <Function test_ticket49658_init>>

@pytest.fixture(scope="module")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5ef5f3cd0>
from_instance = <lib389.DirSrv object at 0x7fa5efbcf880>
to_instance = <lib389.DirSrv object at 0x7fa5efbd2ee0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket49658_test.py::test_ticket49658_4::setup 0.00
request = <SubRequest 'topo' for <Function test_ticket49658_init>>

@pytest.fixture(scope="module")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5ef5f3cd0>
from_instance = <lib389.DirSrv object at 0x7fa5efbcf880>
to_instance = <lib389.DirSrv object at 0x7fa5efbd2ee0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket49658_test.py::test_ticket49658_5::setup 0.00
request = <SubRequest 'topo' for <Function test_ticket49658_init>>

@pytest.fixture(scope="module")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5ef5f3cd0>
from_instance = <lib389.DirSrv object at 0x7fa5efbcf880>
to_instance = <lib389.DirSrv object at 0x7fa5efbd2ee0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket49658_test.py::test_ticket49658_6::setup 0.00
request = <SubRequest 'topo' for <Function test_ticket49658_init>>

@pytest.fixture(scope="module")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5ef5f3cd0>
from_instance = <lib389.DirSrv object at 0x7fa5efbcf880>
to_instance = <lib389.DirSrv object at 0x7fa5efbd2ee0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket49658_test.py::test_ticket49658_7::setup 0.00
request = <SubRequest 'topo' for <Function test_ticket49658_init>>

@pytest.fixture(scope="module")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5ef5f3cd0>
from_instance = <lib389.DirSrv object at 0x7fa5efbcf880>
to_instance = <lib389.DirSrv object at 0x7fa5efbd2ee0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket49658_test.py::test_ticket49658_8::setup 0.00
request = <SubRequest 'topo' for <Function test_ticket49658_init>>

@pytest.fixture(scope="module")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5ef5f3cd0>
from_instance = <lib389.DirSrv object at 0x7fa5efbcf880>
to_instance = <lib389.DirSrv object at 0x7fa5efbd2ee0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket49658_test.py::test_ticket49658_9::setup 0.00
request = <SubRequest 'topo' for <Function test_ticket49658_init>>

@pytest.fixture(scope="module")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5ef5f3cd0>
from_instance = <lib389.DirSrv object at 0x7fa5efbcf880>
to_instance = <lib389.DirSrv object at 0x7fa5efbd2ee0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket49658_test.py::test_ticket49658_10::setup 0.00
request = <SubRequest 'topo' for <Function test_ticket49658_init>>

@pytest.fixture(scope="module")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5ef5f3cd0>
from_instance = <lib389.DirSrv object at 0x7fa5efbcf880>
to_instance = <lib389.DirSrv object at 0x7fa5efbd2ee0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket49658_test.py::test_ticket49658_11::setup 0.00
request = <SubRequest 'topo' for <Function test_ticket49658_init>>

@pytest.fixture(scope="module")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5ef5f3cd0>
from_instance = <lib389.DirSrv object at 0x7fa5efbcf880>
to_instance = <lib389.DirSrv object at 0x7fa5efbd2ee0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket49658_test.py::test_ticket49658_12::setup 0.00
request = <SubRequest 'topo' for <Function test_ticket49658_init>>

@pytest.fixture(scope="module")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5ef5f3cd0>
from_instance = <lib389.DirSrv object at 0x7fa5efbcf880>
to_instance = <lib389.DirSrv object at 0x7fa5efbd2ee0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
Error tickets/ticket49658_test.py::test_ticket49658_13::setup 0.00
request = <SubRequest 'topo' for <Function test_ticket49658_init>>

@pytest.fixture(scope="module")
def topology_m3(request):
"""Create Replication Deployment with three masters"""

> topology = create_topology({ReplicaRole.MASTER: 3})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5ef5f3cd0>
from_instance = <lib389.DirSrv object at 0x7fa5efbcf880>
to_instance = <lib389.DirSrv object at 0x7fa5efbd2ee0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
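Because topo is a module-scoped fixture, pytest caches this setup error and replays it for every test in tickets/ticket49658_test.py that requests the fixture, so the traceback above applies verbatim to each of the entries that follow. A small self-contained illustration of that caching behaviour (hypothetical test names):

    import pytest

    @pytest.fixture(scope="module")
    def topo():
        # Fails once during the first test's setup; pytest then reports the
        # cached error for every later test in the module using this fixture.
        raise Exception("Replication did not sync in time!")

    def test_first(topo):
        pass  # reported as an error in setup

    def test_second(topo):
        pass  # reported as the same setup error, with an identical traceback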
Error tickets/ticket49658_test.py::test_ticket49658_14::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_15::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_16::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_17::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_18::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_19::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_20::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_21::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_22::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_23::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_24::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_25::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_26::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_27::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_28::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_29::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_30::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_31::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_32::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_33::setup 0.00
Error tickets/ticket49658_test.py::test_ticket49658_34::setup 0.00
Error tickets/ticket50078_test.py::test_ticket50078::setup 48.19
request = <SubRequest 'topology_m1h1c1' for <Function test_ticket50078>>

@pytest.fixture(scope="module")
def topology_m1h1c1(request):
    """Create Replication Deployment with one master, one consumer and one hub"""

    topo_roles = {ReplicaRole.MASTER: 1, ReplicaRole.HUB: 1, ReplicaRole.CONSUMER: 1}
    topology = _create_instances(topo_roles, DEFAULT_SUFFIX)
    master = topology.ms["master1"]
    hub = topology.hs["hub1"]
    consumer = topology.cs["consumer1"]

    # Start with the master, and create it "first".
    log.info("Creating replication topology.")
    # Now get the first master ready.
    repl = ReplicationManager(DEFAULT_SUFFIX)
    repl.create_first_master(master)
    # Finish the topology creation
>   repl.join_hub(master, hub)

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:520:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2226: in join_hub
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5ef7a3af0>
from_instance = <lib389.DirSrv object at 0x7fa5ef7bc5b0>
to_instance = <lib389.DirSrv object at 0x7fa5ef7bcc70>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
Instance slapd-hub1 removed.
Instance slapd-consumer1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39101, 'ldap-secureport': 63801, 'server-id': 'hub1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:515 Creating replication topology.
INFO  lib389.replica:replica.py:2101 SUCCESS: bootstrap to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 completed
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 was created
INFO  lib389.replica:replica.py:2515 Retry: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect d11fdbbc-a8dd-4229-945f-7959074bad32 / got description=None) [message repeated 20 times over the 20 s timeout]
INFO  lib389.replica:replica.py:2517 FAIL: Replication from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect d11fdbbc-a8dd-4229-945f-7959074bad32 / got description=None)
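For reference, the probe that keeps timing out here is just a tagged write followed by a one-second poll on the receiving side. A condensed sketch of the same mechanism, assuming two already-open DirSrv connections m1 and m2 and the dc=example,dc=com suffix:

# Condensed version of the wait_for_replication probe quoted above: write a
# UUID into the replication_managers group's description on m1, then poll m2.
import time
import uuid
from lib389.idm.group import Groups

def replicated(m1, m2, suffix='dc=example,dc=com', timeout=20):
    tag = str(uuid.uuid4())
    Groups(m1, basedn=suffix, rdn=None).get('replication_managers').replace('description', tag)
    for _ in range(timeout):
        got = Groups(m2, basedn=suffix, rdn=None).get('replication_managers').get_attr_val_utf8('description')
        if got == tag:
            return True
        time.sleep(1)
    return False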
Failed suites/basic/basic_test.py::test_dscreate 8.65
request = <FixtureRequest for <Function test_dscreate>>

@pytest.mark.skipif(not get_user_is_root() or not default_paths.perl_enabled or ds_is_older('1.4.0.0'),
reason="This test is only required if perl is enabled, and requires root.")
def test_dscreate(request):
"""Test that dscreate works, we need this for now until setup-ds.pl is
fully discontinued.

:id: 5bf75c47-a283-430e-a65c-3c5fd8dbadb9
:setup: None
:steps:
1. Create template file for dscreate
2. Create instance using template file
:expectedresults:
1. Should succeed
2. Should succeed
"""

template_file = "/tmp/dssetup.inf"
template_text = """[general]
config_version = 2
# This invalid hostname ...
full_machine_name = localhost.localdomain
# Means we absolutely require this.
strict_host_checking = False
# In tests, we can be run in containers, NEVER trust
# that systemd is there, or functional in any capacity
systemd = False

[slapd]
instance_name = test_dscreate
root_dn = cn=directory manager
root_password = someLongPassword_123
# We do not have access to high ports in containers,
# so default to something higher.
port = 38999
secure_port = 63699


[backend-userroot]
suffix = dc=example,dc=com
sample_entries = yes
"""

with open(template_file, "w") as template_fd:
template_fd.write(template_text)

# Unset PYTHONPATH to avoid mixing old CLI tools and new lib389
tmp_env = os.environ
if "PYTHONPATH" in tmp_env:
del tmp_env["PYTHONPATH"]
try:
> subprocess.check_call([
'dscreate',
'from-file',
template_file
], env=tmp_env)

suites/basic/basic_test.py:1256:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

popenargs = (['dscreate', 'from-file', '/tmp/dssetup.inf'],)
kwargs = {'env': environ({'SHELL': '/bin/bash', 'DS_SKIP_LABEL': '1', '__INTERNAL_PERSISTENT_DATA': '/var/tmp/beakerlib-8592800...n/py.test', 'FAKETIME_DID_REEXEC': 'true', 'PYTEST_CURRENT_TEST': 'suites/basic/basic_test.py::test_dscreate (call)'})}
retcode = 1, cmd = ['dscreate', 'from-file', '/tmp/dssetup.inf']

def check_call(*popenargs, **kwargs):
"""Run command with arguments. Wait for command to complete. If
the exit code was zero then return, otherwise raise
CalledProcessError. The CalledProcessError object will have the
return code in the returncode attribute.

The arguments are the same as for the call function. Example:

check_call(["ls", "-l"])
"""
retcode = call(*popenargs, **kwargs)
if retcode:
cmd = kwargs.get("args")
if cmd is None:
cmd = popenargs[0]
> raise CalledProcessError(retcode, cmd)
E subprocess.CalledProcessError: Command '['dscreate', 'from-file', '/tmp/dssetup.inf']' returned non-zero exit status 1.

/usr/lib64/python3.8/subprocess.py:364: CalledProcessError

During handling of the above exception, another exception occurred:

request = <FixtureRequest for <Function test_dscreate>>

@pytest.mark.skipif(not get_user_is_root() or not default_paths.perl_enabled or ds_is_older('1.4.0.0'),
reason="This test is only required if perl is enabled, and requires root.")
def test_dscreate(request):
"""Test that dscreate works, we need this for now until setup-ds.pl is
fully discontinued.

:id: 5bf75c47-a283-430e-a65c-3c5fd8dbadb9
:setup: None
:steps:
1. Create template file for dscreate
2. Create instance using template file
:expectedresults:
1. Should succeed
2. Should succeed
"""

template_file = "/tmp/dssetup.inf"
template_text = """[general]
config_version = 2
# This invalid hostname ...
full_machine_name = localhost.localdomain
# Means we absolutely require this.
strict_host_checking = False
# In tests, we can be run in containers, NEVER trust
# that systemd is there, or functional in any capacity
systemd = False

[slapd]
instance_name = test_dscreate
root_dn = cn=directory manager
root_password = someLongPassword_123
# We do not have access to high ports in containers,
# so default to something higher.
port = 38999
secure_port = 63699


[backend-userroot]
suffix = dc=example,dc=com
sample_entries = yes
"""

with open(template_file, "w") as template_fd:
template_fd.write(template_text)

# Unset PYTHONPATH to avoid mixing old CLI tools and new lib389
tmp_env = os.environ
if "PYTHONPATH" in tmp_env:
del tmp_env["PYTHONPATH"]
try:
subprocess.check_call([
'dscreate',
'from-file',
template_file
], env=tmp_env)
except subprocess.CalledProcessError as e:
log.fatal("dscreate failed! Error ({}) {}".format(e.returncode, e.output))
> assert False
E assert False

suites/basic/basic_test.py:1263: AssertionError
------------------------------Captured stdout call------------------------------
Starting installation...
Error: -1 - Can't contact LDAP server - 2 - [] - No such file or directory
-------------------------------Captured log call--------------------------------
CRITICAL tests.suites.basic.basic_test:basic_test.py:1262 dscreate failed! Error (1) None
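Note the "Error (1) None" in the log line: subprocess.check_call never captures the child's output, so e.output is always None and the only diagnostics land in the captured stdout above. A hedged alternative that keeps the same dscreate invocation but records what it printed:

# Sketch: run dscreate with output capture so a failure log carries the real
# error text instead of "Error (1) None".
import subprocess

result = subprocess.run(['dscreate', 'from-file', '/tmp/dssetup.inf'],
                        capture_output=True, text=True)
if result.returncode != 0:
    print(f"dscreate failed ({result.returncode}): "
          f"{result.stdout.strip()} {result.stderr.strip()}")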
Failed suites/basic/basic_test.py::test_dscreate_with_different_rdn[c=uk] 0.39
dscreate_test_rdn_value = ('/tmp/dssetup.inf', environ({'SHELL': '/bin/bash', 'DS_SKIP_LABEL': '1', '__INTERNAL_PERSISTENT_DATA': '/var/tmp/beak...: 'true', 'PYTEST_CURRENT_TEST': 'suites/basic/basic_test.py::test_dscreate_with_different_rdn[c=uk] (call)'}), 'c=uk')

@pytest.mark.skipif(not get_user_is_root() or ds_is_older('1.4.0.0'),
reason="This test is only required with new admin cli, and requires root.")
@pytest.mark.bz1807419
@pytest.mark.ds50928
def test_dscreate_with_different_rdn(dscreate_test_rdn_value):
"""Test that dscreate works with different RDN attributes as suffix

:id: 77ed6300-6a2f-4e79-a862-1f1105f1e3ef
:parametrized: yes
:setup: None
:steps:
1. Create template file for dscreate with different RDN attributes as suffix
2. Create instance using template file
3. Create instance with 'wrong=some_value' as suffix's RDN attribute
:expectedresults:
1. Should succeed
2. Should succeed
3. Should fail
"""
try:
> subprocess.check_call([
'dscreate',
'from-file',
dscreate_test_rdn_value[0]
], env=dscreate_test_rdn_value[1])

suites/basic/basic_test.py:1468:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

popenargs = (['dscreate', 'from-file', '/tmp/dssetup.inf'],)
kwargs = {'env': environ({'SHELL': '/bin/bash', 'DS_SKIP_LABEL': '1', '__INTERNAL_PERSISTENT_DATA': '/var/tmp/beakerlib-8592800..._REEXEC': 'true', 'PYTEST_CURRENT_TEST': 'suites/basic/basic_test.py::test_dscreate_with_different_rdn[c=uk] (call)'})}
retcode = 1, cmd = ['dscreate', 'from-file', '/tmp/dssetup.inf']

def check_call(*popenargs, **kwargs):
"""Run command with arguments. Wait for command to complete. If
the exit code was zero then return, otherwise raise
CalledProcessError. The CalledProcessError object will have the
return code in the returncode attribute.

The arguments are the same as for the call function. Example:

check_call(["ls", "-l"])
"""
retcode = call(*popenargs, **kwargs)
if retcode:
cmd = kwargs.get("args")
if cmd is None:
cmd = popenargs[0]
> raise CalledProcessError(retcode, cmd)
E subprocess.CalledProcessError: Command '['dscreate', 'from-file', '/tmp/dssetup.inf']' returned non-zero exit status 1.

/usr/lib64/python3.8/subprocess.py:364: CalledProcessError

During handling of the above exception, another exception occurred:

dscreate_test_rdn_value = ('/tmp/dssetup.inf', environ({'SHELL': '/bin/bash', 'DS_SKIP_LABEL': '1', '__INTERNAL_PERSISTENT_DATA': '/var/tmp/beak...: 'true', 'PYTEST_CURRENT_TEST': 'suites/basic/basic_test.py::test_dscreate_with_different_rdn[c=uk] (call)'}), 'c=uk')

@pytest.mark.skipif(not get_user_is_root() or ds_is_older('1.4.0.0'),
reason="This test is only required with new admin cli, and requires root.")
@pytest.mark.bz1807419
@pytest.mark.ds50928
def test_dscreate_with_different_rdn(dscreate_test_rdn_value):
"""Test that dscreate works with different RDN attributes as suffix

:id: 77ed6300-6a2f-4e79-a862-1f1105f1e3ef
:parametrized: yes
:setup: None
:steps:
1. Create template file for dscreate with different RDN attributes as suffix
2. Create instance using template file
3. Create instance with 'wrong=some_value' as suffix's RDN attribute
:expectedresults:
1. Should succeed
2. Should succeed
3. Should fail
"""
try:
subprocess.check_call([
'dscreate',
'from-file',
dscreate_test_rdn_value[0]
], env=dscreate_test_rdn_value[1])
except subprocess.CalledProcessError as e:
log.fatal(f"dscreate failed! Error ({e.returncode}) {e.output}")
if dscreate_test_rdn_value[2] != "wrong=some_value":
> assert False
E assert False

suites/basic/basic_test.py:1476: AssertionError
------------------------------Captured stdout call------------------------------
Starting installation...
Error: port 38999 is already in use, or missing NET_BIND_SERVICE
-------------------------------Captured log call--------------------------------
CRITICAL tests.suites.basic.basic_test:basic_test.py:1474 dscreate failed! Error (1) None
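This and the following parametrized runs all fail on the same "port 38999 is already in use" message, most likely an instance left behind by the earlier test_dscreate failure. A small standard-library pre-check (port number taken from the template above; port_is_free is hypothetical) would make the cascade explicit:

# Hypothetical pre-flight check: connect_ex returns 0 when something is
# already listening on the port dscreate is about to claim.
import socket

def port_is_free(port, host='localhost'):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        return s.connect_ex((host, port)) != 0

assert port_is_free(38999), "port 38999 busy: a previous instance is still up"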
Failed suites/basic/basic_test.py::test_dscreate_with_different_rdn[cn=test_user] 0.42
dscreate_test_rdn_value = ('/tmp/dssetup.inf', environ({'SHELL': '/bin/bash', 'DS_SKIP_LABEL': '1', '__INTERNAL_PERSISTENT_DATA': '/var/tmp/beak...T_CURRENT_TEST': 'suites/basic/basic_test.py::test_dscreate_with_different_rdn[cn=test_user] (call)'}), 'cn=test_user')

@pytest.mark.skipif(not get_user_is_root() or ds_is_older('1.4.0.0'),
reason="This test is only required with new admin cli, and requires root.")
@pytest.mark.bz1807419
@pytest.mark.ds50928
def test_dscreate_with_different_rdn(dscreate_test_rdn_value):
"""Test that dscreate works with different RDN attributes as suffix

:id: 77ed6300-6a2f-4e79-a862-1f1105f1e3ef
:parametrized: yes
:setup: None
:steps:
1. Create template file for dscreate with different RDN attributes as suffix
2. Create instance using template file
3. Create instance with 'wrong=some_value' as suffix's RDN attribute
:expectedresults:
1. Should succeed
2. Should succeed
3. Should fail
"""
try:
> subprocess.check_call([
'dscreate',
'from-file',
dscreate_test_rdn_value[0]
], env=dscreate_test_rdn_value[1])

suites/basic/basic_test.py:1468:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

popenargs = (['dscreate', 'from-file', '/tmp/dssetup.inf'],)
kwargs = {'env': environ({'SHELL': '/bin/bash', 'DS_SKIP_LABEL': '1', '__INTERNAL_PERSISTENT_DATA': '/var/tmp/beakerlib-8592800...: 'true', 'PYTEST_CURRENT_TEST': 'suites/basic/basic_test.py::test_dscreate_with_different_rdn[cn=test_user] (call)'})}
retcode = 1, cmd = ['dscreate', 'from-file', '/tmp/dssetup.inf']

def check_call(*popenargs, **kwargs):
"""Run command with arguments. Wait for command to complete. If
the exit code was zero then return, otherwise raise
CalledProcessError. The CalledProcessError object will have the
return code in the returncode attribute.

The arguments are the same as for the call function. Example:

check_call(["ls", "-l"])
"""
retcode = call(*popenargs, **kwargs)
if retcode:
cmd = kwargs.get("args")
if cmd is None:
cmd = popenargs[0]
> raise CalledProcessError(retcode, cmd)
E subprocess.CalledProcessError: Command '['dscreate', 'from-file', '/tmp/dssetup.inf']' returned non-zero exit status 1.

/usr/lib64/python3.8/subprocess.py:364: CalledProcessError

During handling of the above exception, another exception occurred:

dscreate_test_rdn_value = ('/tmp/dssetup.inf', environ({'SHELL': '/bin/bash', 'DS_SKIP_LABEL': '1', '__INTERNAL_PERSISTENT_DATA': '/var/tmp/beak...T_CURRENT_TEST': 'suites/basic/basic_test.py::test_dscreate_with_different_rdn[cn=test_user] (call)'}), 'cn=test_user')

@pytest.mark.skipif(not get_user_is_root() or ds_is_older('1.4.0.0'),
reason="This test is only required with new admin cli, and requires root.")
@pytest.mark.bz1807419
@pytest.mark.ds50928
def test_dscreate_with_different_rdn(dscreate_test_rdn_value):
"""Test that dscreate works with different RDN attributes as suffix

:id: 77ed6300-6a2f-4e79-a862-1f1105f1e3ef
:parametrized: yes
:setup: None
:steps:
1. Create template file for dscreate with different RDN attributes as suffix
2. Create instance using template file
3. Create instance with 'wrong=some_value' as suffix's RDN attribute
:expectedresults:
1. Should succeed
2. Should succeed
3. Should fail
"""
try:
subprocess.check_call([
'dscreate',
'from-file',
dscreate_test_rdn_value[0]
], env=dscreate_test_rdn_value[1])
except subprocess.CalledProcessError as e:
log.fatal(f"dscreate failed! Error ({e.returncode}) {e.output}")
if dscreate_test_rdn_value[2] != "wrong=some_value":
> assert False
E assert False

suites/basic/basic_test.py:1476: AssertionError
-----------------------------Captured stdout setup------------------------------
No such instance 'test_different_rdn'
Unable to access instance information. Are you running as the correct user? (usually dirsrv or root)
-------------------------------Captured log setup-------------------------------
CRITICAL tests.suites.basic.basic_test:basic_test.py:1441 Failed to remove test instance Error (1) None
------------------------------Captured stdout call------------------------------
Starting installation...
Error: port 38999 is already in use, or missing NET_BIND_SERVICE
-------------------------------Captured log call--------------------------------
CRITICAL tests.suites.basic.basic_test:basic_test.py:1474 dscreate failed! Error (1) None
Failed suites/basic/basic_test.py::test_dscreate_with_different_rdn[dc=example,dc=com] 0.45
dscreate_test_rdn_value = ('/tmp/dssetup.inf', environ({'SHELL': '/bin/bash', 'DS_SKIP_LABEL': '1', '__INTERNAL_PERSISTENT_DATA': '/var/tmp/beak...TEST': 'suites/basic/basic_test.py::test_dscreate_with_different_rdn[dc=example,dc=com] (call)'}), 'dc=example,dc=com')

@pytest.mark.skipif(not get_user_is_root() or ds_is_older('1.4.0.0'),
reason="This test is only required with new admin cli, and requires root.")
@pytest.mark.bz1807419
@pytest.mark.ds50928
def test_dscreate_with_different_rdn(dscreate_test_rdn_value):
"""Test that dscreate works with different RDN attributes as suffix

:id: 77ed6300-6a2f-4e79-a862-1f1105f1e3ef
:parametrized: yes
:setup: None
:steps:
1. Create template file for dscreate with different RDN attributes as suffix
2. Create instance using template file
3. Create instance with 'wrong=some_value' as suffix's RDN attribute
:expectedresults:
1. Should succeed
2. Should succeed
3. Should fail
"""
try:
> subprocess.check_call([
'dscreate',
'from-file',
dscreate_test_rdn_value[0]
], env=dscreate_test_rdn_value[1])

suites/basic/basic_test.py:1468:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

popenargs = (['dscreate', 'from-file', '/tmp/dssetup.inf'],)
kwargs = {'env': environ({'SHELL': '/bin/bash', 'DS_SKIP_LABEL': '1', '__INTERNAL_PERSISTENT_DATA': '/var/tmp/beakerlib-8592800...ue', 'PYTEST_CURRENT_TEST': 'suites/basic/basic_test.py::test_dscreate_with_different_rdn[dc=example,dc=com] (call)'})}
retcode = 1, cmd = ['dscreate', 'from-file', '/tmp/dssetup.inf']

def check_call(*popenargs, **kwargs):
"""Run command with arguments. Wait for command to complete. If
the exit code was zero then return, otherwise raise
CalledProcessError. The CalledProcessError object will have the
return code in the returncode attribute.

The arguments are the same as for the call function. Example:

check_call(["ls", "-l"])
"""
retcode = call(*popenargs, **kwargs)
if retcode:
cmd = kwargs.get("args")
if cmd is None:
cmd = popenargs[0]
> raise CalledProcessError(retcode, cmd)
E subprocess.CalledProcessError: Command '['dscreate', 'from-file', '/tmp/dssetup.inf']' returned non-zero exit status 1.

/usr/lib64/python3.8/subprocess.py:364: CalledProcessError

During handling of the above exception, another exception occurred:

dscreate_test_rdn_value = ('/tmp/dssetup.inf', environ({'SHELL': '/bin/bash', 'DS_SKIP_LABEL': '1', '__INTERNAL_PERSISTENT_DATA': '/var/tmp/beak...TEST': 'suites/basic/basic_test.py::test_dscreate_with_different_rdn[dc=example,dc=com] (call)'}), 'dc=example,dc=com')

@pytest.mark.skipif(not get_user_is_root() or ds_is_older('1.4.0.0'),
reason="This test is only required with new admin cli, and requires root.")
@pytest.mark.bz1807419
@pytest.mark.ds50928
def test_dscreate_with_different_rdn(dscreate_test_rdn_value):
"""Test that dscreate works with different RDN attributes as suffix

:id: 77ed6300-6a2f-4e79-a862-1f1105f1e3ef
:parametrized: yes
:setup: None
:steps:
1. Create template file for dscreate with different RDN attributes as suffix
2. Create instance using template file
3. Create instance with 'wrong=some_value' as suffix's RDN attribute
:expectedresults:
1. Should succeed
2. Should succeed
3. Should fail
"""
try:
subprocess.check_call([
'dscreate',
'from-file',
dscreate_test_rdn_value[0]
], env=dscreate_test_rdn_value[1])
except subprocess.CalledProcessError as e:
log.fatal(f"dscreate failed! Error ({e.returncode}) {e.output}")
if dscreate_test_rdn_value[2] != "wrong=some_value":
> assert False
E assert False

suites/basic/basic_test.py:1476: AssertionError
-----------------------------Captured stdout setup------------------------------
No such instance 'test_different_rdn'
Unable to access instance information. Are you running as the correct user? (usually dirsrv or root)
-------------------------------Captured log setup-------------------------------
CRITICAL tests.suites.basic.basic_test:basic_test.py:1441 Failed to remove test instance Error (1) None
------------------------------Captured stdout call------------------------------
Starting installation...
Error: port 38999 is already in use, or missing NET_BIND_SERVICE
-------------------------------Captured log call--------------------------------
CRITICAL tests.suites.basic.basic_test:basic_test.py:1474 dscreate failed! Error (1) None
Failed suites/basic/basic_test.py::test_dscreate_with_different_rdn[o=south] 0.53
dscreate_test_rdn_value = ('/tmp/dssetup.inf', environ({'SHELL': '/bin/bash', 'DS_SKIP_LABEL': '1', '__INTERNAL_PERSISTENT_DATA': '/var/tmp/beak...e', 'PYTEST_CURRENT_TEST': 'suites/basic/basic_test.py::test_dscreate_with_different_rdn[o=south] (call)'}), 'o=south')

@pytest.mark.skipif(not get_user_is_root() or ds_is_older('1.4.0.0'),
reason="This test is only required with new admin cli, and requires root.")
@pytest.mark.bz1807419
@pytest.mark.ds50928
def test_dscreate_with_different_rdn(dscreate_test_rdn_value):
"""Test that dscreate works with different RDN attributes as suffix

:id: 77ed6300-6a2f-4e79-a862-1f1105f1e3ef
:parametrized: yes
:setup: None
:steps:
1. Create template file for dscreate with different RDN attributes as suffix
2. Create instance using template file
3. Create instance with 'wrong=some_value' as suffix's RDN attribute
:expectedresults:
1. Should succeed
2. Should succeed
3. Should fail
"""
try:
> subprocess.check_call([
'dscreate',
'from-file',
dscreate_test_rdn_value[0]
], env=dscreate_test_rdn_value[1])

suites/basic/basic_test.py:1468:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

popenargs = (['dscreate', 'from-file', '/tmp/dssetup.inf'],)
kwargs = {'env': environ({'SHELL': '/bin/bash', 'DS_SKIP_LABEL': '1', '__INTERNAL_PERSISTENT_DATA': '/var/tmp/beakerlib-8592800...EXEC': 'true', 'PYTEST_CURRENT_TEST': 'suites/basic/basic_test.py::test_dscreate_with_different_rdn[o=south] (call)'})}
retcode = 1, cmd = ['dscreate', 'from-file', '/tmp/dssetup.inf']

def check_call(*popenargs, **kwargs):
"""Run command with arguments. Wait for command to complete. If
the exit code was zero then return, otherwise raise
CalledProcessError. The CalledProcessError object will have the
return code in the returncode attribute.

The arguments are the same as for the call function. Example:

check_call(["ls", "-l"])
"""
retcode = call(*popenargs, **kwargs)
if retcode:
cmd = kwargs.get("args")
if cmd is None:
cmd = popenargs[0]
> raise CalledProcessError(retcode, cmd)
E subprocess.CalledProcessError: Command '['dscreate', 'from-file', '/tmp/dssetup.inf']' returned non-zero exit status 1.

/usr/lib64/python3.8/subprocess.py:364: CalledProcessError

During handling of the above exception, another exception occurred:

dscreate_test_rdn_value = ('/tmp/dssetup.inf', environ({'SHELL': '/bin/bash', 'DS_SKIP_LABEL': '1', '__INTERNAL_PERSISTENT_DATA': '/var/tmp/beak...e', 'PYTEST_CURRENT_TEST': 'suites/basic/basic_test.py::test_dscreate_with_different_rdn[o=south] (call)'}), 'o=south')

@pytest.mark.skipif(not get_user_is_root() or ds_is_older('1.4.0.0'),
reason="This test is only required with new admin cli, and requires root.")
@pytest.mark.bz1807419
@pytest.mark.ds50928
def test_dscreate_with_different_rdn(dscreate_test_rdn_value):
"""Test that dscreate works with different RDN attributes as suffix

:id: 77ed6300-6a2f-4e79-a862-1f1105f1e3ef
:parametrized: yes
:setup: None
:steps:
1. Create template file for dscreate with different RDN attributes as suffix
2. Create instance using template file
3. Create instance with 'wrong=some_value' as suffix's RDN attribute
:expectedresults:
1. Should succeed
2. Should succeed
3. Should fail
"""
try:
subprocess.check_call([
'dscreate',
'from-file',
dscreate_test_rdn_value[0]
], env=dscreate_test_rdn_value[1])
except subprocess.CalledProcessError as e:
log.fatal(f"dscreate failed! Error ({e.returncode}) {e.output}")
if dscreate_test_rdn_value[2] != "wrong=some_value":
> assert False
E assert False

suites/basic/basic_test.py:1476: AssertionError
-----------------------------Captured stdout setup------------------------------
No such instance 'test_different_rdn'
Unable to access instance information. Are you running as the correct user? (usually dirsrv or root)
-------------------------------Captured log setup-------------------------------
CRITICAL tests.suites.basic.basic_test:basic_test.py:1441 Failed to remove test instance Error (1) None
------------------------------Captured stdout call------------------------------
Starting installation...
Error: port 38999 is already in use, or missing NET_BIND_SERVICE
-------------------------------Captured log call--------------------------------
CRITICAL tests.suites.basic.basic_test:basic_test.py:1474 dscreate failed! Error (1) None
Failed suites/basic/basic_test.py::test_dscreate_with_different_rdn[ou=sales] 0.43
dscreate_test_rdn_value = ('/tmp/dssetup.inf', environ({'SHELL': '/bin/bash', 'DS_SKIP_LABEL': '1', '__INTERNAL_PERSISTENT_DATA': '/var/tmp/beak..., 'PYTEST_CURRENT_TEST': 'suites/basic/basic_test.py::test_dscreate_with_different_rdn[ou=sales] (call)'}), 'ou=sales')

@pytest.mark.skipif(not get_user_is_root() or ds_is_older('1.4.0.0'),
reason="This test is only required with new admin cli, and requires root.")
@pytest.mark.bz1807419
@pytest.mark.ds50928
def test_dscreate_with_different_rdn(dscreate_test_rdn_value):
"""Test that dscreate works with different RDN attributes as suffix

:id: 77ed6300-6a2f-4e79-a862-1f1105f1e3ef
:parametrized: yes
:setup: None
:steps:
1. Create template file for dscreate with different RDN attributes as suffix
2. Create instance using template file
3. Create instance with 'wrong=some_value' as suffix's RDN attribute
:expectedresults:
1. Should succeed
2. Should succeed
3. Should fail
"""
try:
> subprocess.check_call([
'dscreate',
'from-file',
dscreate_test_rdn_value[0]
], env=dscreate_test_rdn_value[1])

suites/basic/basic_test.py:1468:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

popenargs = (['dscreate', 'from-file', '/tmp/dssetup.inf'],)
kwargs = {'env': environ({'SHELL': '/bin/bash', 'DS_SKIP_LABEL': '1', '__INTERNAL_PERSISTENT_DATA': '/var/tmp/beakerlib-8592800...XEC': 'true', 'PYTEST_CURRENT_TEST': 'suites/basic/basic_test.py::test_dscreate_with_different_rdn[ou=sales] (call)'})}
retcode = 1, cmd = ['dscreate', 'from-file', '/tmp/dssetup.inf']

def check_call(*popenargs, **kwargs):
"""Run command with arguments. Wait for command to complete. If
the exit code was zero then return, otherwise raise
CalledProcessError. The CalledProcessError object will have the
return code in the returncode attribute.

The arguments are the same as for the call function. Example:

check_call(["ls", "-l"])
"""
retcode = call(*popenargs, **kwargs)
if retcode:
cmd = kwargs.get("args")
if cmd is None:
cmd = popenargs[0]
> raise CalledProcessError(retcode, cmd)
E subprocess.CalledProcessError: Command '['dscreate', 'from-file', '/tmp/dssetup.inf']' returned non-zero exit status 1.

/usr/lib64/python3.8/subprocess.py:364: CalledProcessError

During handling of the above exception, another exception occurred:

dscreate_test_rdn_value = ('/tmp/dssetup.inf', environ({'SHELL': '/bin/bash', 'DS_SKIP_LABEL': '1', '__INTERNAL_PERSISTENT_DATA': '/var/tmp/beak..., 'PYTEST_CURRENT_TEST': 'suites/basic/basic_test.py::test_dscreate_with_different_rdn[ou=sales] (call)'}), 'ou=sales')

@pytest.mark.skipif(not get_user_is_root() or ds_is_older('1.4.0.0'),
reason="This test is only required with new admin cli, and requires root.")
@pytest.mark.bz1807419
@pytest.mark.ds50928
def test_dscreate_with_different_rdn(dscreate_test_rdn_value):
"""Test that dscreate works with different RDN attributes as suffix

:id: 77ed6300-6a2f-4e79-a862-1f1105f1e3ef
:parametrized: yes
:setup: None
:steps:
1. Create template file for dscreate with different RDN attributes as suffix
2. Create instance using template file
3. Create instance with 'wrong=some_value' as suffix's RDN attribute
:expectedresults:
1. Should succeed
2. Should succeed
3. Should fail
"""
try:
subprocess.check_call([
'dscreate',
'from-file',
dscreate_test_rdn_value[0]
], env=dscreate_test_rdn_value[1])
except subprocess.CalledProcessError as e:
log.fatal(f"dscreate failed! Error ({e.returncode}) {e.output}")
if dscreate_test_rdn_value[2] != "wrong=some_value":
> assert False
E assert False

suites/basic/basic_test.py:1476: AssertionError
-----------------------------Captured stdout setup------------------------------
No such instance 'test_different_rdn'
Unable to access instance information. Are you running as the correct user? (usually dirsrv or root)
-------------------------------Captured log setup-------------------------------
CRITICAL tests.suites.basic.basic_test:basic_test.py:1441 Failed to remove test instance Error (1) None
------------------------------Captured stdout call------------------------------
Starting installation...
Error: port 38999 is already in use, or missing NET_BIND_SERVICE
-------------------------------Captured log call--------------------------------
CRITICAL tests.suites.basic.basic_test:basic_test.py:1474 dscreate failed! Error (1) None
Failed suites/password/pwdPolicy_syntax_test.py::test_basic 6.34
topology_st = <lib389.topologies.TopologyMain object at 0x7fa5f667a2b0>
create_user = None, password_policy = None

def test_basic(topology_st, create_user, password_policy):
"""Ensure that on a password change, the policy syntax
is enforced correctly.

:id: e8de7029-7fa6-4e96-9eb6-4a121f4c8fb3
:setup: Standalone instance, a test user,
global password policy with:
passwordCheckSyntax - on; nsslapd-pwpolicy-local - off;
passwordMinCategories - 1
:steps:
1. Set passwordMinLength to 10 in cn=config
2. Set userPassword to 'passwd' on the test user
3. Set userPassword to 'password123' on the test user
4. Set passwordMinLength to 2 in cn=config
5. Set passwordMinDigits to 2 in cn=config
6. Set userPassword to 'passwd' on the test user
7. Set userPassword to 'password123' on the test user
8. Set passwordMinDigits to 0 in cn=config
9. Set passwordMinAlphas to 2 in cn=config
10. Set userPassword to 'p123456789' on the test user
11. Set userPassword to 'password123' on the test user
12. Set passwordMinAlphas to 0 in cn=config
13. Set passwordMaxRepeats to 2 in cn=config
14. Set userPassword to 'password' on the test user
15. Set userPassword to 'password123' on the test user
16. Set passwordMaxRepeats to 0 in cn=config
17. Set passwordMinSpecials to 2 in cn=config
18. Set userPassword to 'passwd' on the test user
19. Set userPassword to 'password_#$' on the test user
20. Set passwordMinSpecials to 0 in cn=config
21. Set passwordMinLowers to 2 in cn=config
22. Set userPassword to 'PASSWORD123' on the test user
23. Set userPassword to 'password123' on the test user
24. Set passwordMinLowers to 0 in cn=config
25. Set passwordMinUppers to 2 in cn=config
26. Set userPassword to 'password' on the test user
27. Set userPassword to 'PASSWORD' on the test user
28. Set passwordMinUppers to 0 in cn=config
29. Test passwordDictCheck
30. Test passwordPalindrome
31. Test passwordMaxSequence for forward number sequence
32. Test passwordMaxSequence for backward number sequence
33. Test passwordMaxSequence for forward alpha sequence
34. Test passwordMaxSequence for backward alpha sequence
35. Test passwordMaxClassChars for digits
36. Test passwordMaxClassChars for specials
37. Test passwordMaxClassChars for lowers
38. Test passwordMaxClassChars for uppers
39. Test passwordBadWords using 'redhat' and 'fedora'
40. Test passwordUserAttrs using description attribute

:expectedresults:
1. passwordMinLength should be successfully set
2. Password should be rejected because length too short
3. Password should be accepted
4. passwordMinLength should be successfully set
5. passwordMinDigits should be successfully set
6. Password should be rejected because
it does not contain minimum number of digits
7. Password should be accepted
8. passwordMinDigits should be successfully set
9. passwordMinAlphas should be successfully set
10. Password should be rejected because
it does not contain minimum number of alphas
11. Password should be accepted
12. passwordMinAlphas should be successfully set
13. passwordMaxRepeats should be successfully set
14. Password should be rejected because too many repeating characters
15. Password should be accepted
16. passwordMaxRepeats should be successfully set
17. passwordMinSpecials should be successfully set
18. Password should be rejected because
it does not contain minimum number of special characters
19. Password should be accepted
20. passwordMinSpecials should be successfully set
21. passwordMinLowers should be successfully set
22. Password should be rejected because
it does not contain minimum number of lowercase characters
23. Password should be accepted
24. passwordMinLowers should be successfully set
25. passwordMinUppers should be successfully set
26. Password should be rejected because
it does not contain minimum number of uppercase characters
27. Password should be accepted
28. passwordMinUppers should be successfully set
29. The passwordDictCheck test succeeds
30. The passwordPalindrome test succeeds
31. Test passwordMaxSequence for forward number sequence succeeds
32. Test passwordMaxSequence for backward number sequence succeeds
33. Test passwordMaxSequence for forward alpha sequence succeeds
34. Test passwordMaxSequence for backward alpha sequence succeeds
35. Test passwordMaxClassChars for digits succeeds
36. Test passwordMaxClassChars for specials succeeds
37. Test passwordMaxClassChars for lowers succeeds
38. Test passwordMaxClassChars for uppers succeeds
39. The passwordBadWords test succeeds
40. The passwordUserAttrs test succeeds
"""

#
# Test each syntax category
#
ous = OrganizationalUnits(topology_st.standalone, DEFAULT_SUFFIX)
ou = ous.get('people')
ou.add('aci', USER_ACI)

# Min Length
tryPassword(topology_st.standalone, 'passwordMinLength', 10, 2, 'passwd',
'password123', 'length too short')
# Min Digit
tryPassword(topology_st.standalone, 'passwordMinDigits', 2, 0, 'passwd',
'password123', 'does not contain minimum number of digits')
# Min Alphas
tryPassword(topology_st.standalone, 'passwordMinAlphas', 2, 0, 'p123456789',
'password123', 'does not contain minimum number of alphas')
# Max Repeats
tryPassword(topology_st.standalone, 'passwordMaxRepeats', 2, 0, 'passsword',
'password123', 'too many repeating characters')
# Min Specials
tryPassword(topology_st.standalone, 'passwordMinSpecials', 2, 0, 'passwd',
'password_#$',
'does not contain minimum number of special characters')
# Min Lowers
tryPassword(topology_st.standalone, 'passwordMinLowers', 2, 0, 'PASSWORD123',
'password123',
'does not contain minimum number of lowercase characters')
# Min Uppers
tryPassword(topology_st.standalone, 'passwordMinUppers', 2, 0, 'password',
'PASSWORD',
'does not contain minimum number of uppercase characters')
# Min 8-bits - "ldap" package only accepts ascii strings at the moment

if ds_is_newer('1.4.0.13'):
# Dictionary check
tryPassword(topology_st.standalone, 'passwordDictCheck', 'on', 'on', 'PASSWORD',
'13_#Kad472h', 'Password found in dictionary')

# Palindromes
tryPassword(topology_st.standalone, 'passwordPalindrome', 'on', 'on', 'Za12_#_21aZ',
'13_#Kad472h', 'Password is palindrome')

# Sequences
tryPassword(topology_st.standalone, 'passwordMaxSequence', 3, 0, 'Za1_1234',
'13_#Kad472h', 'Max monotonic sequence is not allowed')
tryPassword(topology_st.standalone, 'passwordMaxSequence', 3, 0, 'Za1_4321',
'13_#Kad472h', 'Max monotonic sequence is not allowed')
tryPassword(topology_st.standalone, 'passwordMaxSequence', 3, 0, 'Za1_abcd',
'13_#Kad472h', 'Max monotonic sequence is not allowed')
tryPassword(topology_st.standalone, 'passwordMaxSequence', 3, 0, 'Za1_dcba',
'13_#Kad472h', 'Max monotonic sequence is not allowed')

# Sequence Sets
> tryPassword(topology_st.standalone, 'passwordMaxSeqSets', 2, 0, 'Za1_123--123',
'13_#Kad472h', 'Max monotonic sequence is not allowed')

suites/password/pwdPolicy_syntax_test.py:265:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

inst = <lib389.DirSrv object at 0x7fa5f667aee0>
policy_attr = 'passwordMaxSeqSets', value = 2, reset_value = 0
pw_bad = 'Za1_123--123', pw_good = '13_#Kad472h'
msg = 'Max monotonic sequence is not allowed'

def tryPassword(inst, policy_attr, value, reset_value, pw_bad, pw_good, msg):
"""Attempt to change the users password
inst: DirSrv Object
password: password
msg - error message if failure
"""

setPolicy(inst, policy_attr, value)
inst.simple_bind_s(USER_DN, PASSWORD)
users = UserAccounts(inst, DEFAULT_SUFFIX)
user = users.get(USER_RDN)
try:
user.reset_password(pw_bad)
log.fatal('Invalid password was unexpectedly accepted (%s)' %
(policy_attr))
> assert False
E assert False

suites/password/pwdPolicy_syntax_test.py:96: AssertionError
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:33 Enable global password policy. Check for syntax.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:98 Invalid password correctly rejected by passwordMinLength: length too short
INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:98 Invalid password correctly rejected by passwordMinDigits: does not contain minimum number of digits
INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:98 Invalid password correctly rejected by passwordMinAlphas: does not contain minimum number of alphas
INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:98 Invalid password correctly rejected by passwordMaxRepeats: too many repeating characters
INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:98 Invalid password correctly rejected by passwordMinSpecials: does not contain minimum number of special characters
INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:98 Invalid password correctly rejected by passwordMinLowers: does not contain minimum number of lowercase characters
INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:98 Invalid password correctly rejected by passwordMinUppers: does not contain minimum number of lowercase characters
INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:98 Invalid password correctly rejected by passwordDictCheck: Password found in dictionary
INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:98 Invalid password correctly rejected by passwordPalindrome: Password is palindrome
INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:98 Invalid password correctly rejected by passwordMaxSequence: Max monotonic sequence is not allowed
INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:98 Invalid password correctly rejected by passwordMaxSequence: Max monotonic sequence is not allowed
INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:98 Invalid password correctly rejected by passwordMaxSequence: Max monotonic sequence is not allowed
INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:98 Invalid password correctly rejected by passwordMaxSequence: Max monotonic sequence is not allowed
CRITICAL tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:94 Invalid password was unexpectedly accepted (passwordMaxSeqSets)
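This failure is the inverse of every earlier check: the server accepted 'Za1_123--123' even though passwordMaxSeqSets was 2, so tryPassword reached its assert False. A condensed sketch of the accept/reject probe tryPassword performs, assuming a standalone DirSrv instance and a fixture-created user (user_rdn is a placeholder):

# Sketch of the rejection probe: a syntax violation should surface as
# ldap.CONSTRAINT_VIOLATION from reset_password; acceptance means the policy
# attribute (here passwordMaxSeqSets) was not enforced.
import ldap
from lib389._constants import DEFAULT_SUFFIX
from lib389.idm.user import UserAccounts

def expect_rejected(inst, user_rdn, bad_pw):
    user = UserAccounts(inst, DEFAULT_SUFFIX).get(user_rdn)
    try:
        user.reset_password(bad_pw)
    except ldap.CONSTRAINT_VIOLATION:
        return True   # rejected, as the policy requires
    return False      # accepted: the failure logged above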
Failed suites/password/pwdPolicy_syntax_test.py::test_config_set_few_user_attributes 0.00
topology_st = <lib389.topologies.TopologyMain object at 0x7fa5f667a2b0>
create_user = None, password_policy = None

@pytest.mark.bz1816857
@pytest.mark.ds50875
@pytest.mark.skipif(ds_is_older("1.4.1.18"), reason="Not implemented")
def test_config_set_few_user_attributes(topology_st, create_user, password_policy):
"""Test that we can successfully set multiple values to passwordUserAttributes

:id: 188e0aee-6e29-4857-910c-27d5606f8c08
:setup: Standalone instance
:steps:
1. Set passwordUserAttributes to "description loginShell"
2. Verify passwordUserAttributes has the values
3. Verify passwordUserAttributes enforced the policy
:expectedresults:
1. Operation should be successful
2. Operation should be successful
3. Operation should be successful
"""

standalone = topology_st.standalone

standalone.log.info('Set passwordUserAttributes to "description loginShell"')
> standalone.config.set('passwordUserAttributes', 'description loginshell')

suites/password/pwdPolicy_syntax_test.py:308:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:446: in set
return self._instance.modify_ext_s(self._dn, [(action, key, value)],
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:613: in modify_ext_s
resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7fa5f667aee0>
func = <built-in method result4 of LDAP object at 0x7fa5f6509360>
args = (240, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.INSUFFICIENT_ACCESS: {'msgtype': 103, 'msgid': 240, 'result': 50, 'desc': 'Insufficient access', 'ctrls': [], 'info': "Insufficient 'write' privilege to the 'passwordUserAttributes' attribute of entry 'cn=config'.\n"}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: INSUFFICIENT_ACCESS
-------------------------------Captured log call--------------------------------
INFO  lib389:pwdPolicy_syntax_test.py:307 Set passwordUserAttributes to "description loginShell"
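Both this failure and the passwordBadWords one below are INSUFFICIENT_ACCESS on cn=config, which suggests the connection is still bound as the unprivileged test user that the previous test's simple_bind_s left in place. A hedged sketch of restoring root access first (DN_DM and PW_DM are lib389's root-DN constants; standalone is the DirSrv from topology_st):

# Sketch: rebind as the root DN before writing to cn=config; the traceback
# shows the modify went out under the test user's credentials.
from lib389._constants import DN_DM, PW_DM

standalone.simple_bind_s(DN_DM, PW_DM)   # back to cn=Directory Manager
standalone.config.set('passwordUserAttributes', 'description loginshell')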
Failed suites/password/pwdPolicy_syntax_test.py::test_config_set_few_bad_words 0.01
topology_st = <lib389.topologies.TopologyMain object at 0x7fa5f667a2b0>
create_user = None, password_policy = None

@pytest.mark.bz1816857
@pytest.mark.ds50875
@pytest.mark.skipif(ds_is_older("1.4.1.18"), reason="Not implemented")
def test_config_set_few_bad_words(topology_st, create_user, password_policy):
"""Test that we can successfully set multiple values to passwordBadWords

:id: 2977094c-921c-4b2f-af91-4c7a45ded48b
:setup: Standalone instance
:steps:
1. Set passwordBadWords to "fedora redhat"
2. Verify passwordBadWords has the values
3. Verify passwordBadWords enforced the policy
:expectedresults:
1. Operation should be successful
2. Operation should be successful
3. Operation should be successful
"""

standalone = topology_st.standalone

standalone.log.info('Set passwordBadWords to "fedora redhat"')
> standalone.config.set('passwordBadWords', 'fedora redhat')

suites/password/pwdPolicy_syntax_test.py:349:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:446: in set
return self._instance.modify_ext_s(self._dn, [(action, key, value)],
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:613: in modify_ext_s
resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7fa5f667aee0>
func = <built-in method result4 of LDAP object at 0x7fa5f6509360>
args = (241, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.INSUFFICIENT_ACCESS: {'msgtype': 103, 'msgid': 241, 'result': 50, 'desc': 'Insufficient access', 'ctrls': [], 'info': "Insufficient 'write' privilege to the 'passwordBadWords' attribute of entry 'cn=config'.\n"}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: INSUFFICIENT_ACCESS
-------------------------------Captured log call--------------------------------
INFO  lib389:pwdPolicy_syntax_test.py:348 Set passwordBadWords to "fedora redhat"
Failed suites/password/regression_test.py::test_unhashed_pw_switch 10.65
topo_master = <lib389.topologies.TopologyMain object at 0x7fa5f5b7d0d0>

@pytest.mark.ds49789
def test_unhashed_pw_switch(topo_master):
"""Check that nsslapd-unhashed-pw-switch works corrently

:id: e5aba180-d174-424d-92b0-14fe7bb0b92a
:setup: Master Instance
:steps:
1. A Master is created, enable retrocl (not used here)
2. Create a set of users
3. update userpassword of user1 and check that unhashed#user#password is not logged (default)
4. update userpassword of user2 and check that unhashed#user#password is not logged ('nolog')
5. update userpassword of user3 and check that unhashed#user#password is logged ('on')
:expectedresults:
1. Success
2. Success
3. Success (unhashed#user#password is not logged in the replication changelog)
4. Success (unhashed#user#password is not logged in the replication changelog)
5. Success (unhashed#user#password is logged in the replication changelog)
"""
MAX_USERS = 10
PEOPLE_DN = ("ou=people," + DEFAULT_SUFFIX)

inst = topo_master.ms["master1"]
inst.modify_s("cn=Retro Changelog Plugin,cn=plugins,cn=config",
[(ldap.MOD_REPLACE, 'nsslapd-changelogmaxage', b'2m'),
(ldap.MOD_REPLACE, 'nsslapd-changelog-trim-interval', b"5s"),
(ldap.MOD_REPLACE, 'nsslapd-logAccess', b'on')])
inst.config.loglevel(vals=[256 + 4], service='access')
inst.restart()
# If you need any test suite initialization,
# please, write additional fixture for that (including finalizer).
# Topology for suites are predefined in lib389/topologies.py.

# enable dynamic plugins, memberof and retro cl plugin
#
log.info('Enable plugins...')
try:
inst.modify_s(DN_CONFIG,
[(ldap.MOD_REPLACE,
'nsslapd-dynamic-plugins',
b'on')])
except ldap.LDAPError as e:
ldap.error('Failed to enable dynamic plugins! ' + e.message['desc'])
assert False

#topology_st.standalone.plugins.enable(name=PLUGIN_MEMBER_OF)
inst.plugins.enable(name=PLUGIN_RETRO_CHANGELOG)
#topology_st.standalone.modify_s("cn=changelog,cn=ldbm database,cn=plugins,cn=config", [(ldap.MOD_REPLACE, 'nsslapd-cachememsize', str(100000))])
inst.restart()

log.info('create users and group...')
for idx in range(1, MAX_USERS):
try:
USER_DN = ("uid=member%d,%s" % (idx, PEOPLE_DN))
inst.add_s(Entry((USER_DN,
{'objectclass': 'top extensibleObject'.split(),
'uid': 'member%d' % (idx)})))
except ldap.LDAPError as e:
log.fatal('Failed to add user (%s): error %s' % (USER_DN, e.message['desc']))
assert False

# Check default is that unhashed#user#password is not logged on 1.4.1.6+
user = "uid=member1,%s" % (PEOPLE_DN)
inst.modify_s(user, [(ldap.MOD_REPLACE,
'userpassword',
PASSWORD.encode())])
inst.stop()
if ds_is_newer('1.4.1.6'):
> _check_unhashed_userpw(inst, user, is_present=False)

suites/password/regression_test.py:290:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

inst = <lib389.DirSrv object at 0x7fa5f657ea60>
user_dn = 'uid=member1,ou=people,dc=example,dc=com', is_present = False

def _check_unhashed_userpw(inst, user_dn, is_present=False):
"""Check if unhashed#user#password attribute is present or not in the changelog"""
unhashed_pwd_attribute = 'unhashed#user#password'

if ds_supports_new_changelog():
dbscanOut = inst.dbscan(DEFAULT_BENAME, 'changelog')
else:
changelog_dbdir = os.path.join(os.path.dirname(inst.dbdir), DEFAULT_CHANGELOG_DB)
> for dbfile in os.listdir(changelog_dbdir):
E FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/dirsrv/slapd-master1/changelogdb'

suites/password/regression_test.py:50: FileNotFoundError
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:257 Enable plugins...
INFO  tests.suites.password.regression_test:regression_test.py:272 create users and group...
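The FileNotFoundError above comes straight from os.listdir() on a changelog directory that was never created. A defensive sketch of the same scan that fails with a clearer message (path copied from the traceback; whether the directory should exist at this point is an assumption):

import os

changelog_dbdir = '/var/lib/dirsrv/slapd-master1/changelogdb'  # from the traceback above
if not os.path.isdir(changelog_dbdir):
    # Fail with context instead of a bare FileNotFoundError.
    raise AssertionError('changelog directory %s is missing -- was the '
                         'changelog ever written?' % changelog_dbdir)
for dbfile in os.listdir(changelog_dbdir):
    print(dbfile)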
Failed suites/setup_ds/dscreate_test.py::test_setup_ds_minimal 10.51
topology = <tests.suites.setup_ds.dscreate_test.TopologyInstance object at 0x7fa5f4a55d60>

def test_setup_ds_minimal(topology):
# Create the setupDs
lc = LogCapture()
# Give it the right types.
sds = SetupDs(verbose=DEBUGGING, dryrun=False, log=lc.log)

# Get the dicts from Type2Base, as though they were from _validate_ds_2_config
# IE get the defaults back just from Slapd2Base.collect
# Override instance name, root password, port and secure port.

general_options = General2Base(lc.log)
general_options.verify()
general = general_options.collect()

slapd_options = Slapd2Base(lc.log)
slapd_options.set('instance_name', INSTANCE_SERVERID)
slapd_options.set('port', INSTANCE_PORT)
slapd_options.set('root_password', PW_DM)
slapd_options.verify()
slapd = slapd_options.collect()

> sds.create_from_args(general, slapd, {}, None)

suites/setup_ds/dscreate_test.py:109:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/instance/setup.py:674: in create_from_args
self._install_ds(general, slapd, backends)
/usr/local/lib/python3.8/site-packages/lib389/instance/setup.py:910: in _install_ds
ds_instance.start(timeout=60)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:1179: in start
self.open()
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:1067: in open
self.sasl_interactive_bind_s("", sasl_auth, escapehatch='i am sure')
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:476: in sasl_interactive_bind_s
return self._ldap_call(self._l.sasl_interactive_bind_s,who,auth,RequestControlTuples(serverctrls),RequestControlTuples(clientctrls),sasl_flags)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7fa5f4e6c520>
func = <built-in method sasl_interactive_bind_s of LDAP object at 0x7fa5f4c37300>
args = ('', <ldap.sasl.external object at 0x7fa5f4c373a0>, None, None, 2)
kwargs = {}, diagnostic_message_success = None, exc_type = None
exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.SERVER_DOWN: {'result': -1, 'desc': "Can't contact LDAP server", 'errno': 2, 'ctrls': [], 'info': 'No such file or directory'}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: SERVER_DOWN
-------------------------------Captured log setup-------------------------------
DEBUG  lib389:dscreate_test.py:42 Instance allocated
DEBUG  lib389:__init__.py:554 Allocate <class 'lib389.DirSrv'> with None
DEBUG  lib389:__init__.py:577 Allocate <class 'lib389.DirSrv'> with ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:54321
DEBUG  lib389:__init__.py:602 Allocate <class 'lib389.DirSrv'> with ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:54321
DEBUG  lib389:__init__.py:742 list instance not found in /etc/dirsrv/slapd-standalone/dse.ldif: standalone
-------------------------------Captured log call--------------------------------
DEBUG  lib389:__init__.py:742 list instance not found in /etc/dirsrv/slapd-standalone/dse.ldif: standalone
DEBUG  lib389:__init__.py:554 Allocate <class 'lib389.DirSrv'> with None
DEBUG  lib389:__init__.py:566 Allocate <class 'lib389.DirSrv'> with /var/run/slapd-standalone.socket
DEBUG  lib389:__init__.py:577 Allocate <class 'lib389.DirSrv'> with localhost:54321
DEBUG  lib389:__init__.py:602 Allocate <class 'lib389.DirSrv'> with localhost:54321
DEBUG  lib389:nss_ssl.py:197 nss cmd: /usr/bin/certutil -N -d /etc/dirsrv/slapd-standalone -f /etc/dirsrv/slapd-standalone/pwdfile.txt
DEBUG  lib389:nss_ssl.py:199 nss output:
DEBUG  lib389.nss_ssl:nss_ssl.py:197 nss cmd: /usr/bin/certutil -N -d /etc/dirsrv/ssca/ -f /etc/dirsrv/ssca//pwdfile.txt
DEBUG  lib389.nss_ssl:nss_ssl.py:199 nss output:
DEBUG  lib389.nss_ssl:nss_ssl.py:280 nss cmd: /usr/bin/certutil -S -n Self-Signed-CA -s CN=ssca.389ds.example.com,O=testing,L=389ds,ST=Queensland,C=AU -x -g 4096 -t CT,, -v 24 -2 --keyUsage certSigning -d /etc/dirsrv/ssca/ -z /etc/dirsrv/ssca//noise.txt -f /etc/dirsrv/ssca//pwdfile.txt
DEBUG  lib389.nss_ssl:nss_ssl.py:282 nss output: Is this a CA certificate [y/N]? Enter the path length constraint, enter to skip [<0 for unlimited path]: > Is this a critical extension [y/N]?
DEBUG  lib389.nss_ssl:nss_ssl.py:294 nss cmd: /usr/bin/certutil -L -n Self-Signed-CA -d /etc/dirsrv/ssca/ -a
DEBUG  lib389.nss_ssl:nss_ssl.py:242 nss cmd: /usr/bin/openssl rehash /etc/dirsrv/ssca/
DEBUG  lib389:nss_ssl.py:559 CSR subject -> CN=ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com,givenName=b45f349a-5079-43ed-bb7f-c7978772d4d5,O=testing,L=389ds,ST=Queensland,C=AU
DEBUG  lib389:nss_ssl.py:560 CSR alt_names -> ['ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com']
DEBUG  lib389:nss_ssl.py:592 nss cmd: /usr/bin/certutil -R --keyUsage digitalSignature,nonRepudiation,keyEncipherment,dataEncipherment --nsCertType sslClient,sslServer --extKeyUsage clientAuth,serverAuth -s CN=ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com,givenName=b45f349a-5079-43ed-bb7f-c7978772d4d5,O=testing,L=389ds,ST=Queensland,C=AU -8 ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com -g 4096 -d /etc/dirsrv/slapd-standalone -z /etc/dirsrv/slapd-standalone/noise.txt -f /etc/dirsrv/slapd-standalone/pwdfile.txt -a -o /etc/dirsrv/slapd-standalone/Server-Cert.csr
DEBUG  lib389.nss_ssl:nss_ssl.py:618 nss cmd: /usr/bin/certutil -C -d /etc/dirsrv/ssca/ -f /etc/dirsrv/ssca//pwdfile.txt -v 24 -a -i /etc/dirsrv/slapd-standalone/Server-Cert.csr -o /etc/dirsrv/slapd-standalone/Server-Cert.crt -c Self-Signed-CA
DEBUG  lib389:nss_ssl.py:242 nss cmd: /usr/bin/openssl rehash /etc/dirsrv/slapd-standalone
DEBUG  lib389:nss_ssl.py:646 nss cmd: /usr/bin/certutil -A -n Self-Signed-CA -t CT,, -a -i /etc/dirsrv/slapd-standalone/ca.crt -d /etc/dirsrv/slapd-standalone -f /etc/dirsrv/slapd-standalone/pwdfile.txt
DEBUG  lib389:nss_ssl.py:661 nss cmd: /usr/bin/certutil -A -n Server-Cert -t ,, -a -i /etc/dirsrv/slapd-standalone/Server-Cert.crt -d /etc/dirsrv/slapd-standalone -f /etc/dirsrv/slapd-standalone/pwdfile.txt
DEBUG  lib389:nss_ssl.py:670 nss cmd: /usr/bin/certutil -V -d /etc/dirsrv/slapd-standalone -n Server-Cert -u YCV
DEBUG  lib389.utils:utils.py:284 port 636 already in [389, 636, 3268, 3269, 7389], skipping port relabel
DEBUG  lib389.utils:utils.py:315 CMD: semanage port -a -t ldap_port_t -p tcp 54321 ; STDOUT: ; STDERR:
DEBUG  lib389:__init__.py:1229 systemd status -> True
DEBUG  lib389:__init__.py:1133 systemd status -> True
DEBUG  lib389:__init__.py:993 open(): Connecting to uri ldapi://%2Fvar%2Frun%2Fslapd-standalone.socket
DEBUG  lib389:__init__.py:1001 Using dirsrv ca certificate /etc/dirsrv/slapd-standalone
DEBUG  lib389:__init__.py:1010 Using external ca certificate /etc/dirsrv/slapd-standalone
DEBUG  lib389:__init__.py:1023 Using external ca certificate /etc/dirsrv/slapd-standalone
DEBUG  lib389:__init__.py:1031 Using certificate policy 1
DEBUG  lib389:__init__.py:1032 ldap.OPT_X_TLS_REQUIRE_CERT = 1
DEBUG  lib389:__init__.py:1065 open(): Using root autobind ...
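SERVER_DOWN on the LDAPI bind right after _install_ds started the instance suggests open() raced the server's socket creation. A sketch that waits for the socket before binding; the socket path and URI are taken from the log above, and the 30-second budget is an assumption:

import os
import time
import ldap
import ldap.sasl

socket_path = '/var/run/slapd-standalone.socket'
uri = 'ldapi://%2Fvar%2Frun%2Fslapd-standalone.socket'

# Give the freshly started instance time to create its LDAPI socket.
for _ in range(30):
    if os.path.exists(socket_path):
        break
    time.sleep(1)

conn = ldap.initialize(uri)
conn.sasl_interactive_bind_s('', ldap.sasl.external())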
Failed suites/syncrepl_plugin/basic_test.py::test_sync_repl_mep 22.93
topology = <lib389.topologies.TopologyMain object at 0x7fa5f947c190>
request = <FixtureRequest for <Function test_sync_repl_mep>>

def test_sync_repl_mep(topology, request):
"""Test sync repl with MEP plugin that triggers several
updates on the same entry

:id: d9515930-293e-42da-9835-9f255fa6111b
:setup: Standalone Instance
:steps:
1. enable retro/sync_repl/mep
2. Add mep Template and definition entry
3. start sync_repl client
4. Add users with PosixAccount ObjectClass (mep will update it several times)
5. Check that the received cookies are progressing
:expectedresults:
1. Success
2. Success
3. Success
4. Success
5. Success
"""
inst = topology[0]

# Enable/configure retroCL
plugin = RetroChangelogPlugin(inst)
plugin.disable()
plugin.enable()
plugin.set('nsslapd-attribute', 'nsuniqueid:targetuniqueid')

# Enable sync plugin
plugin = ContentSyncPlugin(inst)
plugin.enable()

# Check the plug-in status
mana = ManagedEntriesPlugin(inst)
plugin.enable()

# Add Template and definition entry
org1 = OrganizationalUnits(inst, DEFAULT_SUFFIX).create(properties={'ou': 'Users'})
org2 = OrganizationalUnit(inst, f'ou=Groups,{DEFAULT_SUFFIX}')
meps = MEPTemplates(inst, DEFAULT_SUFFIX)
mep_template1 = meps.create(properties={
'cn': 'UPG Template1',
'mepRDNAttr': 'cn',
'mepStaticAttr': 'objectclass: posixGroup',
'mepMappedAttr': 'cn: $uid|gidNumber: $gidNumber|description: User private group for $uid'.split('|')})
conf_mep = MEPConfigs(inst)
mep_config = conf_mep.create(properties={
'cn': 'UPG Definition2',
'originScope': org1.dn,
'originFilter': 'objectclass=posixaccount',
'managedBase': org2.dn,
'managedTemplate': mep_template1.dn})

# Enable plugin log level (useful for debugging)
inst.setLogLevel(65536)
inst.restart()

# create a sync repl client and wait 5 seconds to be sure it is running
sync_repl = Sync_persist(inst)
sync_repl.start()
time.sleep(5)

# Add users with PosixAccount ObjectClass and verify creation of User Private Group
user = UserAccounts(inst, f'ou=Users,{DEFAULT_SUFFIX}', rdn=None).create_test_user()
assert user.get_attr_val_utf8('mepManagedEntry') == f'cn=test_user_1000,ou=Groups,{DEFAULT_SUFFIX}'

# Stop the server to get the sync_repl result set (exit from the while loop).
# This is the only way I found to achieve that,
# and wait a bit to give the sync_repl thread time to set its result before fetching it.
inst.stop()
time.sleep(10)
cookies = sync_repl.get_result()

# check that the cookies are increasing and in an acceptable range (0..1000)
assert len(cookies) > 0
prev = 0
for cookie in cookies:
log.info('Check cookie %s' % cookie)

assert int(cookie) > 0
assert int(cookie) < 1000
> assert int(cookie) > prev
E AssertionError: assert 9 > 9
E + where 9 = int('9')

suites/syncrepl_plugin/basic_test.py:231: AssertionError
------------------------------Captured stdout call------------------------------
syncrepl_poll: LDAP error (%s) {'result': -1, 'desc': "Can't contact LDAP server", 'ctrls': []}
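The assert 9 > 9 above means the same change number was delivered twice, so the strictly-increasing check trips on a duplicate rather than on an actual reordering. If duplicate delivery is benign for sync repl (an assumption this sketch makes), a non-decreasing check isolates true ordering bugs:

# Illustrative cookie list shaped like this run's data, not taken from it.
cookies = ['1', '3', '9', '9']
prev = 0
for cookie in cookies:
    assert 0 < int(cookie) < 1000
    # >= tolerates duplicates; a real regression would show int(cookie) < prev.
    assert int(cookie) >= prev
    prev = int(cookie)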
Failed suites/syncrepl_plugin/basic_test.py::test_sync_repl_cookie 0.00
topology = <lib389.topologies.TopologyMain object at 0x7fa5f947c190>
request = <FixtureRequest for <Function test_sync_repl_cookie>>

def test_sync_repl_cookie(topology, request):
"""Test sync_repl cookie are progressing is an increasing order
when there are nested updates

:id: d7fbde25-5702-46ac-b38e-169d7a68e97c
:setup: Standalone Instance
:steps:
1.: enable retroCL
2.: configure retroCL to log nsuniqueid as targetUniqueId
3.: enable content_sync plugin
4.: enable automember
5.: create (2) groups. A few groups help to reproduce the concurrent updates problem.
6.: configure automember to provision those groups with 'member'
7.: enable and configure memberof plugin
8.: enable plugin log level
9.: restart the server
10.: create a thread dedicated to run a sync repl client
11.: Create (9) users that will generate nested updates (automember/memberof)
12.: stop sync repl client and collect the list of cookie.change_no
13.: check that cookies.change_no are in increasing order
:expectedresults:
1.: succeeds
2.: succeeds
3.: succeeds
4.: succeeds
5.: succeeds
6.: succeeds
7.: succeeds
8.: succeeds
9.: succeeds
10.: succeeds
11.: succeeds
12.: succeeds
13.: succeeds
"""
inst = topology[0]

# Enable/configure retroCL
plugin = RetroChangelogPlugin(inst)
> plugin.disable()

suites/syncrepl_plugin/basic_test.py:275:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/plugins.py:63: in disable
self.set('nsslapd-pluginEnabled', 'off')
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.plugins.RetroChangelogPlugin object at 0x7fa5f4a6ea00>
key = 'nsslapd-pluginEnabled', value = 'off', action = 2

def set(self, key, value, action=ldap.MOD_REPLACE):
"""Perform a specified action on a key with value

:param key: an attribute name
:type key: str
:param value: an attribute value
:type value: str
:param action: - ldap.MOD_REPLACE - by default
- ldap.MOD_ADD
- ldap.MOD_DELETE
:type action: int

:returns: result of modify_s operation
:raises: ValueError - if instance is not online
"""

if action == ldap.MOD_ADD:
action_txt = "ADD"
elif action == ldap.MOD_REPLACE:
action_txt = "REPLACE"
elif action == ldap.MOD_DELETE:
action_txt = "DELETE"
else:
# This should never happen (bug!)
action_txt = "UNKNOWN"

if value is None or len(value) < 512:
self._log.debug("%s set %s: (%r, %r)" % (self._dn, action_txt, key, display_log_value(key, value)))
else:
self._log.debug("%s set %s: (%r, value too large)" % (self._dn, action_txt, key))
if self._instance.state != DIRSRV_STATE_ONLINE:
> raise ValueError("Invalid state. Cannot set properties on instance that is not ONLINE.")
E ValueError: Invalid state. Cannot set properties on instance that is not ONLINE.

/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:438: ValueError
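This failure and the two that follow never get past plugin.disable(): the instance is still stopped from the previous test, which shut the server down to collect its sync_repl result set. A guard like the sketch below at the top of each test, or in a fixture, would decouple them (assumption: lib389's DirSrv.status()/start() behave as their names suggest):

def ensure_online(inst):
    # status() is assumed to report whether the instance is running.
    if not inst.status():
        inst.start()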
Failed suites/syncrepl_plugin/basic_test.py::test_sync_repl_cookie_add_del 0.00
topology = <lib389.topologies.TopologyMain object at 0x7fa5f947c190>
request = <FixtureRequest for <Function test_sync_repl_cookie_add_del>>

def test_sync_repl_cookie_add_del(topology, request):
"""Test sync_repl cookie are progressing is an increasing order
when there add and del

:id: 83e11038-6ed0-4a5b-ac77-e44887ab11e3
:setup: Standalone Instance
:steps:
1.: enable retroCL
2.: configure retroCL to log nsuniqueid as targetUniqueId
3.: enable content_sync plugin
4.: enable automember
5.: create (2) groups. A few groups help to reproduce the concurrent updates problem.
6.: configure automember to provision those groups with 'member'
7.: enable and configure memberof plugin
8.: enable plugin log level
9.: restart the server
10.: create a thread dedicated to run a sync repl client
11.: Create (3) users that will generate nested updates (automember/memberof)
12.: Delete (3) users
13.: stop sync repl client and collect the list of cookie.change_no
14.: check that cookies.change_no are in increasing order
:expectedresults:
1.: succeeds
2.: succeeds
3.: succeeds
4.: succeeds
5.: succeeds
6.: succeeds
7.: succeeds
8.: succeeds
9.: succeeds
10.: succeeds
11.: succeeds
12.: succeeds
13.: succeeds
14.: succeeds
"""
inst = topology[0]

# Enable/configure retroCL
plugin = RetroChangelogPlugin(inst)
> plugin.disable()

suites/syncrepl_plugin/basic_test.py:407:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/plugins.py:63: in disable
self.set('nsslapd-pluginEnabled', 'off')
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.plugins.RetroChangelogPlugin object at 0x7fa5f513feb0>
key = 'nsslapd-pluginEnabled', value = 'off', action = 2

def set(self, key, value, action=ldap.MOD_REPLACE):
"""Perform a specified action on a key with value

:param key: an attribute name
:type key: str
:param value: an attribute value
:type value: str
:param action: - ldap.MOD_REPLACE - by default
- ldap.MOD_ADD
- ldap.MOD_DELETE
:type action: int

:returns: result of modify_s operation
:raises: ValueError - if instance is not online
"""

if action == ldap.MOD_ADD:
action_txt = "ADD"
elif action == ldap.MOD_REPLACE:
action_txt = "REPLACE"
elif action == ldap.MOD_DELETE:
action_txt = "DELETE"
else:
# This should never happen (bug!)
action_txt = "UNKNOWN"

if value is None or len(value) < 512:
self._log.debug("%s set %s: (%r, %r)" % (self._dn, action_txt, key, display_log_value(key, value)))
else:
self._log.debug("%s set %s: (%r, value too large)" % (self._dn, action_txt, key))
if self._instance.state != DIRSRV_STATE_ONLINE:
> raise ValueError("Invalid state. Cannot set properties on instance that is not ONLINE.")
E ValueError: Invalid state. Cannot set properties on instance that is not ONLINE.

/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:438: ValueError
Failed suites/syncrepl_plugin/basic_test.py::test_sync_repl_cookie_with_failure 0.00
topology = <lib389.topologies.TopologyMain object at 0x7fa5f947c190>
request = <FixtureRequest for <Function test_sync_repl_cookie_with_failure>>

def test_sync_repl_cookie_with_failure(topology, request):
"""Test sync_repl cookie are progressing is the right order
when there is a failure in nested updates

:id: e0103448-170e-4080-8f22-c34606447ce2
:setup: Standalone Instance
:steps:
1.: enable retroCL
2.: configure retroCL to log nsuniqueid as targetUniqueId
3.: enable content_sync plugin
4.: enable automember
5.: create (4) groups.
make group2 groupOfUniqueNames so the automember
will fail to add 'member' (uniqueMember expected)
6.: configure automember to provision those groups with 'member'
7.: enable and configure memberof plugin
8.: enable plugin log level
9.: restart the server
10.: create a thread dedicated to run a sync repl client
11.: Create a group that will be the only update received by sync repl client
12.: Create (9) users that will generate nested updates (automember/memberof)
13.: stop sync repl client and collect the list of cookie.change_no
14.: check that the list of cookie.change_no contains only the group 'step 11'
:expectedresults:
1.: succeeds
2.: succeeds
3.: succeeds
4.: succeeds
5.: succeeds
6.: succeeds
7.: succeeds
8.: succeeds
9.: succeeds
10.: succeeds
11.: succeeds
12.: Fails (expected)
13.: succeeds
14.: succeeds
"""
inst = topology[0]

# Enable/configure retroCL
plugin = RetroChangelogPlugin(inst)
> plugin.disable()

suites/syncrepl_plugin/basic_test.py:539:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/plugins.py:63: in disable
self.set('nsslapd-pluginEnabled', 'off')
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.plugins.RetroChangelogPlugin object at 0x7fa5f516e610>
key = 'nsslapd-pluginEnabled', value = 'off', action = 2

def set(self, key, value, action=ldap.MOD_REPLACE):
"""Perform a specified action on a key with value

:param key: an attribute name
:type key: str
:param value: an attribute value
:type value: str
:param action: - ldap.MOD_REPLACE - by default
- ldap.MOD_ADD
- ldap.MOD_DELETE
:type action: int

:returns: result of modify_s operation
:raises: ValueError - if instance is not online
"""

if action == ldap.MOD_ADD:
action_txt = "ADD"
elif action == ldap.MOD_REPLACE:
action_txt = "REPLACE"
elif action == ldap.MOD_DELETE:
action_txt = "DELETE"
else:
# This should never happen (bug!)
action_txt = "UNKNOWN"

if value is None or len(value) < 512:
self._log.debug("%s set %s: (%r, %r)" % (self._dn, action_txt, key, display_log_value(key, value)))
else:
self._log.debug("%s set %s: (%r, value too large)" % (self._dn, action_txt, key))
if self._instance.state != DIRSRV_STATE_ONLINE:
> raise ValueError("Invalid state. Cannot set properties on instance that is not ONLINE.")
E ValueError: Invalid state. Cannot set properties on instance that is not ONLINE.

/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:438: ValueError
Failed tickets/ticket47781_test.py::test_ticket47781 4.03
topology_st = <lib389.topologies.TopologyMain object at 0x7fa5f5212070>

def test_ticket47781(topology_st):
"""
Testing for a deadlock after doing an online import of an LDIF with
replication data. The replication agreement should be invalid.
"""

log.info('Testing Ticket 47781 - Testing for deadlock after importing LDIF with replication data')

master = topology_st.standalone
repl = ReplicationManager(DEFAULT_SUFFIX)
repl.create_first_master(master)

properties = {RA_NAME: r'meTo_$host:$port',
RA_BINDDN: defaultProperties[REPLICATION_BIND_DN],
RA_BINDPW: defaultProperties[REPLICATION_BIND_PW],
RA_METHOD: defaultProperties[REPLICATION_BIND_METHOD],
RA_TRANSPORT_PROT: defaultProperties[REPLICATION_TRANSPORT]}
# The agreement should point to a server that does NOT exist (invalid port)
repl_agreement = master.agreement.create(suffix=DEFAULT_SUFFIX,
host=master.host,
port=5555,
properties=properties)

#
# add two entries
#
log.info('Adding two entries...')

master.add_s(Entry(('cn=entry1,dc=example,dc=com', {
'objectclass': 'top person'.split(),
'sn': 'user',
'cn': 'entry1'})))

master.add_s(Entry(('cn=entry2,dc=example,dc=com', {
'objectclass': 'top person'.split(),
'sn': 'user',
'cn': 'entry2'})))

#
# export the replication ldif
#
log.info('Exporting replication ldif...')
args = {EXPORT_REPL_INFO: True}
exportTask = Tasks(master)
exportTask.exportLDIF(DEFAULT_SUFFIX, None, "/tmp/export.ldif", args)

#
# Restart the server
#
log.info('Restarting server...')
master.stop()
master.start()

#
# Import the ldif
#
log.info('Import replication LDIF file...')
importTask = Tasks(master)
args = {TASK_WAIT: True}
> importTask.importLDIF(DEFAULT_SUFFIX, None, "/tmp/export.ldif", args)

tickets/ticket47781_test.py:85:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.tasks.Tasks object at 0x7fa5f47fe790>
suffix = 'dc=example,dc=com', benamebase = None, input_file = '/tmp/export.ldif'
args = {'wait': True}

def importLDIF(self, suffix=None, benamebase=None, input_file=None,
args=None):
'''
Import from an LDIF file a given 'suffix' (or 'benamebase' that stores
that suffix). It uses an internal task to achieve this request.

If 'suffix' and 'benamebase' are specified, it uses 'benamebase' first
else 'suffix'.
If both 'suffix' and 'benamebase' are missing it raises ValueError

'input_file' is the ldif input file

@param suffix - suffix of the backend
@param benamebase - 'commonname'/'cn' of the backend (e.g. 'userRoot')
@param input_file - file that contains the entries in LDIF format
to import
@param args - is a dictionary that contains modifier of the import task
wait: True/[False] - If True, the import waits for the completion
of the task before returning

@return None

@raise ValueError

'''
if self.conn.state != DIRSRV_STATE_ONLINE:
raise ValueError("Invalid Server State %s! Must be online" % self.conn.state)

# Checking the parameters
if not benamebase and not suffix:
raise ValueError("Specify either bename or suffix")

if not input_file:
raise ValueError("input_file is mandatory")

if not os.path.exists(input_file):
> raise ValueError("Import file (%s) does not exist" % input_file)
E ValueError: Import file (/tmp/export.ldif) does not exist

/usr/local/lib/python3.8/site-packages/lib389/tasks.py:473: ValueError
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:567 Export task export_10202020_211439 for file /tmp/export.ldif completed successfully
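The export task reports success, yet importLDIF cannot find /tmp/export.ldif after the restart. On systemd platforms with PrivateTmp, the server process and the test see different /tmp directories, which would produce exactly this. A sketch that keeps the LDIF under the instance's own ldif directory instead, reusing the names from the test above (assumption: lib389's get_ldif_dir() helper is available in this build):

import os

ldif_file = os.path.join(master.get_ldif_dir(), 'export.ldif')
exportTask.exportLDIF(DEFAULT_SUFFIX, None, ldif_file, {EXPORT_REPL_INFO: True})
master.restart()
importTask.importLDIF(DEFAULT_SUFFIX, None, ldif_file, {TASK_WAIT: True})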
Failed tickets/ticket47973_test.py::test_ticket47973_case 5.07
topology_st = <lib389.topologies.TopologyMain object at 0x7fa5f49d6cd0>

def test_ticket47973_case(topology_st):
log.info('Testing Ticket 47973 (case) - Test the cases in the original schema are preserved.')

log.info('case 1 - Test the cases in the original schema are preserved.')

tsfile = topology_st.standalone.schemadir + '/98test.ldif'
tsfd = open(tsfile, "w")
Mozattr0 = "MoZiLLaaTTRiBuTe"
testschema = "dn: cn=schema\nattributetypes: ( 8.9.10.11.12.13.14 NAME '" + Mozattr0 + "' SYNTAX 1.3.6.1.4.1.1466.115.121.1.15 X-ORIGIN 'Mozilla Dummy Schema' )\nobjectclasses: ( 1.2.3.4.5.6.7 NAME 'MozillaObject' SUP top MUST ( objectclass $ cn ) MAY ( " + Mozattr0 + " ) X-ORIGIN 'user defined' )"
tsfd.write(testschema)
tsfd.close()

try:
# run the schema reload task with the default schemadir
topology_st.standalone.tasks.schemaReload(schemadir=topology_st.standalone.schemadir,
args={TASK_WAIT: False})
except ValueError:
log.error('Schema Reload task failed.')
assert False

time.sleep(5)

try:
schemaentry = topology_st.standalone.search_s("cn=schema", ldap.SCOPE_BASE,
'objectclass=top',
["objectclasses"])
oclist = schemaentry[0].data.get("objectclasses")
except ldap.LDAPError as e:
log.error('Failed to get schema entry: error (%s)' % e.args[0]['desc'])
raise e

found = 0
for oc in oclist:
log.info('OC: %s' % oc)
moz = re.findall(Mozattr0, oc.decode('utf-8'))
if moz:
found = 1
log.info('case 1: %s is in the objectclasses list -- PASS' % Mozattr0)

if found == 0:
log.error('case 1: %s is not in the objectclasses list -- FAILURE' % Mozattr0)
> assert False
E assert False

tickets/ticket47973_test.py:156: AssertionError
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:1169 Schema Reload task (task-10202020_212321) completed successfully
ERROR  tests.tickets.ticket47973_test:ticket47973_test.py:155 case 1: MoZiLLaaTTRiBuTe is not in the objectclasses list -- FAILURE
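The reload task is launched with TASK_WAIT: False and the test then sleeps a fixed 5 seconds before reading cn=schema. Whether or not timing is the root cause of the missing MoZiLLaaTTRiBuTe here, waiting on the task removes that variable; this is the same schemaReload call as in the test, with only the wait flag changed:

# Block until the schema reload task completes instead of sleeping.
topology_st.standalone.tasks.schemaReload(
    schemadir=topology_st.standalone.schemadir,
    args={TASK_WAIT: True})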
Failed tickets/ticket48005_test.py::test_ticket48005_usn 4.48
topology_st = <lib389.topologies.TopologyMain object at 0x7fa5f4188040>

def test_ticket48005_usn(topology_st):
'''
Enable entryusn
Delete all user entries.
Run USN tombstone cleanup task
Shutdown the server
Check if a core file was generated or not
If no core was found, this test case was successful.
'''
log.info("Ticket 48005 usn test...")
topology_st.standalone.plugins.enable(name=PLUGIN_USN)

topology_st.standalone.restart(timeout=10)

try:
> entries = topology_st.standalone.search_s(SUFFIX, ldap.SCOPE_SUBTREE, "(objectclass=inetorgperson)")

tickets/ticket48005_test.py:283:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:870: in search_s
return self.search_ext_s(base,scope,filterstr,attrlist,attrsonly,None,None,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:864: in search_ext_s
return self.result(msgid,all=1,timeout=timeout)[1]
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:148: in inner
objtype, data = f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:756: in result
resp_type, resp_data, resp_msgid = self.result2(msgid,all,timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:760: in result2
resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all,timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7fa5f4188130>
func = <built-in method result4 of LDAP object at 0x7fa5f40622a0>
args = (3, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.NO_SUCH_OBJECT: {'msgtype': 101, 'msgid': 3, 'result': 32, 'desc': 'No such object', 'ctrls': []}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: NO_SUCH_OBJECT
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48005_test:ticket48005_test.py:277 Ticket 48005 usn test...
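err=32 (NO_SUCH_OBJECT) on the subtree search means the suffix base entry itself is absent, so the USN exercise never starts. A sketch that creates the base entry first, following the add_s(Entry(...)) pattern used elsewhere in this report; the objectclass/dc values assume the usual dc=example,dc=com layout shown in the setup log:

topology_st.standalone.add_s(Entry((SUFFIX, {
    'objectclass': 'top domain'.split(),
    'dc': 'example'})))
entries = topology_st.standalone.search_s(SUFFIX, ldap.SCOPE_SUBTREE,
                                          "(objectclass=inetorgperson)")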
Failed tickets/ticket48194_test.py::test_run_1 6.94
topology_st = <lib389.topologies.TopologyMain object at 0x7fa5f4148f70>

def test_run_1(topology_st):
"""
Check nsSSL3Ciphers: +all
All ciphers are enabled except null.
Note: default allowWeakCipher (i.e., off) for +all
"""
_header(topology_st, 'Test Case 2 - Check the ciphers availability for "+all" with default allowWeakCiphers')

topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)
topology_st.standalone.modify_s(CONFIG_DN, [(ldap.MOD_REPLACE, 'nsslapd-errorlog-level', b'64')])
# Make sure allowWeakCipher is not set.
topology_st.standalone.modify_s(ENCRYPTION_DN, [(ldap.MOD_DELETE, 'allowWeakCipher', None)])

log.info("\n######################### Restarting the server ######################\n")
topology_st.standalone.stop(timeout=10)
os.system('mv %s %s.48194_0' % (topology_st.standalone.errlog, topology_st.standalone.errlog))
os.system('touch %s' % (topology_st.standalone.errlog))
time.sleep(2)
topology_st.standalone.start(timeout=120)

> connectWithOpenssl(topology_st, 'DES-CBC3-SHA', False)

tickets/ticket48194_test.py:158:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology_st = <lib389.topologies.TopologyMain object at 0x7fa5f4148f70>
cipher = 'DES-CBC3-SHA', expect = False

def connectWithOpenssl(topology_st, cipher, expect):
"""
Connect with the given cipher
Condition:
If expect is True, the handshake should be successful.
If expect is False, the handshake should be refused with
access log: "Cannot communicate securely with peer:
no common encryption algorithm(s)."
"""
log.info("Testing %s -- expect to handshake %s", cipher, "successfully" if expect else "failed")

myurl = 'localhost:%s' % LDAPSPORT
cmdline = ['/usr/bin/openssl', 's_client', '-connect', myurl, '-cipher', cipher]

strcmdline = " ".join(cmdline)
log.info("Running cmdline: %s", strcmdline)

try:
proc = subprocess.Popen(cmdline, stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.STDOUT)
except ValueError:
log.info("%s failed: %s", cmdline, ValueError)
proc.kill()

while True:
l = proc.stdout.readline()
if l == b"":
break
if b'Cipher is' in l:
log.info("Found: %s", l)
if expect:
if b'(NONE)' in l:
assert False
else:
proc.stdin.close()
assert True
else:
if b'(NONE)' in l:
assert True
else:
proc.stdin.close()
> assert False
E assert False

tickets/ticket48194_test.py:117: AssertionError
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ###############################################
INFO  lib389:ticket48194_test.py:41 ####### Test Case 2 - Check the ciphers availability for "+all" with default allowWeakCiphers
INFO  lib389:ticket48194_test.py:42 ###############################################
INFO  lib389.utils:ticket48194_test.py:151 ######################### Restarting the server ######################
INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect to handshake failed
INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA
INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256\n'
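The log shows the "refused" probe negotiating TLS_AES_128_GCM_SHA256: with OpenSSL 1.1.1+, s_client's -cipher option does not constrain TLS 1.3 suites (those are controlled by -ciphersuites), so any TLS 1.3-capable server handshakes regardless of the legacy cipher requested. Pinning the probe to TLS 1.2 makes the restriction effective; this is a sketch of the one-line change to connectWithOpenssl, not a verified fix for the server-side cipher handling:

# Force a TLS 1.2 handshake so '-cipher' actually limits the negotiation.
cmdline = ['/usr/bin/openssl', 's_client', '-connect', myurl,
           '-tls1_2', '-cipher', cipher]

The same TLS 1.3 fallback ('Cipher is TLS_AES_128_GCM_SHA256') appears in test_run_4, test_run_5, test_run_6 and test_run_8 below.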
Failed tickets/ticket48194_test.py::test_run_2 6.46
topology_st = <lib389.topologies.TopologyMain object at 0x7fa5f4148f70>

def test_run_2(topology_st):
"""
Check nsSSL3Ciphers: +rsa_aes_128_sha,+rsa_aes_256_sha
rsa_aes_128_sha, tls_rsa_aes_128_sha, rsa_aes_256_sha, tls_rsa_aes_256_sha are enabled.
default allowWeakCipher
"""
_header(topology_st,
'Test Case 3 - Check the ciphers availability for "+rsa_aes_128_sha,+rsa_aes_256_sha" with default allowWeakCipher')

topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)
topology_st.standalone.modify_s(ENCRYPTION_DN,
[(ldap.MOD_REPLACE, 'nsSSL3Ciphers', b'+rsa_aes_128_sha,+rsa_aes_256_sha')])

log.info("\n######################### Restarting the server ######################\n")
topology_st.standalone.stop(timeout=10)
os.system('mv %s %s.48194_1' % (topology_st.standalone.errlog, topology_st.standalone.errlog))
os.system('touch %s' % (topology_st.standalone.errlog))
time.sleep(2)
topology_st.standalone.start(timeout=120)

connectWithOpenssl(topology_st, 'DES-CBC3-SHA', False)
connectWithOpenssl(topology_st, 'AES256-SHA256', False)
> connectWithOpenssl(topology_st, 'AES128-SHA', True)

tickets/ticket48194_test.py:184:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology_st = <lib389.topologies.TopologyMain object at 0x7fa5f4148f70>
cipher = 'AES128-SHA', expect = True

def connectWithOpenssl(topology_st, cipher, expect):
"""
Connect with the given cipher
Condition:
If expect is True, the handshake should be successful.
If expect is False, the handshake should be refused with
access log: "Cannot communicate securely with peer:
no common encryption algorithm(s)."
"""
log.info("Testing %s -- expect to handshake %s", cipher, "successfully" if expect else "failed")

myurl = 'localhost:%s' % LDAPSPORT
cmdline = ['/usr/bin/openssl', 's_client', '-connect', myurl, '-cipher', cipher]

strcmdline = " ".join(cmdline)
log.info("Running cmdline: %s", strcmdline)

try:
proc = subprocess.Popen(cmdline, stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.STDOUT)
except ValueError:
log.info("%s failed: %s", cmdline, ValueError)
proc.kill()

while True:
l = proc.stdout.readline()
if l == b"":
break
if b'Cipher is' in l:
log.info("Found: %s", l)
if expect:
if b'(NONE)' in l:
> assert False
E assert False

tickets/ticket48194_test.py:108: AssertionError
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ###############################################
INFO  lib389:ticket48194_test.py:41 ####### Test Case 3 - Check the ciphers availability for "+rsa_aes_128_sha,+rsa_aes_256_sha" with default allowWeakCipher
INFO  lib389:ticket48194_test.py:42 ###############################################
INFO  lib389.utils:ticket48194_test.py:175 ######################### Restarting the server ######################
INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect to handshake failed
INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA
INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, (NONE), Cipher is (NONE)\n'
INFO  lib389.utils:ticket48194_test.py:86 Testing AES256-SHA256 -- expect to handshake failed
INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher AES256-SHA256
INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, (NONE), Cipher is (NONE)\n'
INFO  lib389.utils:ticket48194_test.py:86 Testing AES128-SHA -- expect to handshake successfully
INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher AES128-SHA
INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, (NONE), Cipher is (NONE)\n'
Failed tickets/ticket48194_test.py::test_run_4 6.74
topology_st = <lib389.topologies.TopologyMain object at 0x7fa5f4148f70>

def test_run_4(topology_st):
"""
Check no nsSSL3Ciphers
Default ciphers are enabled.
default allowWeakCipher
"""
_header(topology_st, 'Test Case 5 - Check no nsSSL3Ciphers (-all) with default allowWeakCipher')

topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)
topology_st.standalone.modify_s(ENCRYPTION_DN, [(ldap.MOD_DELETE, 'nsSSL3Ciphers', b'-all')])

log.info("\n######################### Restarting the server ######################\n")
topology_st.standalone.stop(timeout=10)
os.system('mv %s %s.48194_3' % (topology_st.standalone.errlog, topology_st.standalone.errlog))
os.system('touch %s' % (topology_st.standalone.errlog))
time.sleep(2)
topology_st.standalone.start(timeout=120)

> connectWithOpenssl(topology_st, 'DES-CBC3-SHA', False)

tickets/ticket48194_test.py:228:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology_st = <lib389.topologies.TopologyMain object at 0x7fa5f4148f70>
cipher = 'DES-CBC3-SHA', expect = False

def connectWithOpenssl(topology_st, cipher, expect):
"""
Connect with the given cipher
Condition:
If expect is True, the handshake should be successful.
If expect is False, the handshake should be refused with
access log: "Cannot communicate securely with peer:
no common encryption algorithm(s)."
"""
log.info("Testing %s -- expect to handshake %s", cipher, "successfully" if expect else "failed")

myurl = 'localhost:%s' % LDAPSPORT
cmdline = ['/usr/bin/openssl', 's_client', '-connect', myurl, '-cipher', cipher]

strcmdline = " ".join(cmdline)
log.info("Running cmdline: %s", strcmdline)

try:
proc = subprocess.Popen(cmdline, stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.STDOUT)
except ValueError:
log.info("%s failed: %s", cmdline, ValueError)
proc.kill()

while True:
l = proc.stdout.readline()
if l == b"":
break
if b'Cipher is' in l:
log.info("Found: %s", l)
if expect:
if b'(NONE)' in l:
assert False
else:
proc.stdin.close()
assert True
else:
if b'(NONE)' in l:
assert True
else:
proc.stdin.close()
> assert False
E assert False

tickets/ticket48194_test.py:117: AssertionError
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ###############################################
INFO  lib389:ticket48194_test.py:41 ####### Test Case 5 - Check no nsSSL3Ciphers (-all) with default allowWeakCipher
INFO  lib389:ticket48194_test.py:42 ###############################################
INFO  lib389.utils:ticket48194_test.py:221 ######################### Restarting the server ######################
INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect to handshake failed
INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA
INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256\n'
Failed tickets/ticket48194_test.py::test_run_5 6.72
topology_st = <lib389.topologies.TopologyMain object at 0x7fa5f4148f70>

def test_run_5(topology_st):
"""
Check nsSSL3Ciphers: default
Default ciphers are enabled.
default allowWeakCipher
"""
_header(topology_st, 'Test Case 6 - Check default nsSSL3Ciphers (default setting) with default allowWeakCipher')

topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)
topology_st.standalone.modify_s(ENCRYPTION_DN, [(ldap.MOD_REPLACE, 'nsSSL3Ciphers', b'default')])

log.info("\n######################### Restarting the server ######################\n")
topology_st.standalone.stop(timeout=10)
os.system('mv %s %s.48194_4' % (topology_st.standalone.errlog, topology_st.standalone.errlog))
os.system('touch %s' % (topology_st.standalone.errlog))
time.sleep(2)
topology_st.standalone.start(timeout=120)

> connectWithOpenssl(topology_st, 'DES-CBC3-SHA', False)

tickets/ticket48194_test.py:250:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology_st = <lib389.topologies.TopologyMain object at 0x7fa5f4148f70>
cipher = 'DES-CBC3-SHA', expect = False

def connectWithOpenssl(topology_st, cipher, expect):
"""
Connect with the given cipher
Condition:
If expect is True, the handshake should be successful.
If expect is False, the handshake should be refused with
access log: "Cannot communicate securely with peer:
no common encryption algorithm(s)."
"""
log.info("Testing %s -- expect to handshake %s", cipher, "successfully" if expect else "failed")

myurl = 'localhost:%s' % LDAPSPORT
cmdline = ['/usr/bin/openssl', 's_client', '-connect', myurl, '-cipher', cipher]

strcmdline = " ".join(cmdline)
log.info("Running cmdline: %s", strcmdline)

try:
proc = subprocess.Popen(cmdline, stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.STDOUT)
except ValueError:
log.info("%s failed: %s", cmdline, ValueError)
proc.kill()

while True:
l = proc.stdout.readline()
if l == b"":
break
if b'Cipher is' in l:
log.info("Found: %s", l)
if expect:
if b'(NONE)' in l:
assert False
else:
proc.stdin.close()
assert True
else:
if b'(NONE)' in l:
assert True
else:
proc.stdin.close()
> assert False
E assert False

tickets/ticket48194_test.py:117: AssertionError
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ###############################################
INFO  lib389:ticket48194_test.py:41 ####### Test Case 6 - Check default nsSSL3Ciphers (default setting) with default allowWeakCipher
INFO  lib389:ticket48194_test.py:42 ###############################################
INFO  lib389.utils:ticket48194_test.py:243 ######################### Restarting the server ######################
INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect to handshake failed
INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA
INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256\n'
Failed tickets/ticket48194_test.py::test_run_6 6.87
topology_st = <lib389.topologies.TopologyMain object at 0x7fa5f4148f70>

def test_run_6(topology_st):
"""
Check nsSSL3Ciphers: +all,-TLS_RSA_WITH_AES_256_CBC_SHA256
All ciphers are disabled.
default allowWeakCipher
"""
_header(topology_st,
'Test Case 7 - Check nsSSL3Ciphers: +all,-TLS_RSA_WITH_AES_256_CBC_SHA256 with default allowWeakCipher')

topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)
topology_st.standalone.modify_s(ENCRYPTION_DN,
[(ldap.MOD_REPLACE, 'nsSSL3Ciphers', b'+all,-TLS_RSA_WITH_AES_256_CBC_SHA256')])

log.info("\n######################### Restarting the server ######################\n")
topology_st.standalone.stop(timeout=10)
os.system('mv %s %s.48194_5' % (topology_st.standalone.errlog, topology_st.standalone.errlog))
os.system('touch %s' % (topology_st.standalone.errlog))
time.sleep(2)
topology_st.standalone.start(timeout=120)

> connectWithOpenssl(topology_st, 'DES-CBC3-SHA', False)

tickets/ticket48194_test.py:274:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology_st = <lib389.topologies.TopologyMain object at 0x7fa5f4148f70>
cipher = 'DES-CBC3-SHA', expect = False

def connectWithOpenssl(topology_st, cipher, expect):
"""
Connect with the given cipher
Condition:
If expect is True, the handshake should be successful.
If expect is False, the handshake should be refused with
access log: "Cannot communicate securely with peer:
no common encryption algorithm(s)."
"""
log.info("Testing %s -- expect to handshake %s", cipher, "successfully" if expect else "failed")

myurl = 'localhost:%s' % LDAPSPORT
cmdline = ['/usr/bin/openssl', 's_client', '-connect', myurl, '-cipher', cipher]

strcmdline = " ".join(cmdline)
log.info("Running cmdline: %s", strcmdline)

try:
proc = subprocess.Popen(cmdline, stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.STDOUT)
except ValueError:
log.info("%s failed: %s", cmdline, ValueError)
proc.kill()

while True:
l = proc.stdout.readline()
if l == b"":
break
if b'Cipher is' in l:
log.info("Found: %s", l)
if expect:
if b'(NONE)' in l:
assert False
else:
proc.stdin.close()
assert True
else:
if b'(NONE)' in l:
assert True
else:
proc.stdin.close()
> assert False
E assert False

tickets/ticket48194_test.py:117: AssertionError
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ###############################################
INFO  lib389:ticket48194_test.py:41 ####### Test Case 7 - Check nsSSL3Ciphers: +all,-TLS_RSA_WITH_AES_256_CBC_SHA256 with default allowWeakCipher
INFO  lib389:ticket48194_test.py:42 ###############################################
INFO  lib389.utils:ticket48194_test.py:267 ######################### Restarting the server ######################
INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect to handshake failed
INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA
INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256\n'
Failed tickets/ticket48194_test.py::test_run_8 6.75
topology_st = <lib389.topologies.TopologyMain object at 0x7fa5f4148f70>

def test_run_8(topology_st):
"""
Check nsSSL3Ciphers: default + allowWeakCipher: off
Strong Default ciphers are enabled.
"""
_header(topology_st, 'Test Case 9 - Check default nsSSL3Ciphers (default setting + allowWeakCipher: off)')

topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)
topology_st.standalone.modify_s(ENCRYPTION_DN, [(ldap.MOD_REPLACE, 'nsSSL3Ciphers', b'default'),
(ldap.MOD_REPLACE, 'allowWeakCipher', b'off')])

log.info("\n######################### Restarting the server ######################\n")
topology_st.standalone.stop(timeout=10)
os.system('mv %s %s.48194_7' % (topology_st.standalone.errlog, topology_st.standalone.errlog))
os.system('touch %s' % (topology_st.standalone.errlog))
time.sleep(2)
topology_st.standalone.start(timeout=120)

> connectWithOpenssl(topology_st, 'DES-CBC3-SHA', False)

tickets/ticket48194_test.py:297:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology_st = <lib389.topologies.TopologyMain object at 0x7fa5f4148f70>
cipher = 'DES-CBC3-SHA', expect = False

def connectWithOpenssl(topology_st, cipher, expect):
"""
Connect with the given cipher
Condition:
If expect is True, the handshake should be successful.
If expect is False, the handshake should be refused with
access log: "Cannot communicate securely with peer:
no common encryption algorithm(s)."
"""
log.info("Testing %s -- expect to handshake %s", cipher, "successfully" if expect else "failed")

myurl = 'localhost:%s' % LDAPSPORT
cmdline = ['/usr/bin/openssl', 's_client', '-connect', myurl, '-cipher', cipher]

strcmdline = " ".join(cmdline)
log.info("Running cmdline: %s", strcmdline)

try:
proc = subprocess.Popen(cmdline, stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.STDOUT)
except ValueError:
log.info("%s failed: %s", cmdline, ValueError)
proc.kill()

while True:
l = proc.stdout.readline()
if l == b"":
break
if b'Cipher is' in l:
log.info("Found: %s", l)
if expect:
if b'(NONE)' in l:
assert False
else:
proc.stdin.close()
assert True
else:
if b'(NONE)' in l:
assert True
else:
proc.stdin.close()
> assert False
E assert False

tickets/ticket48194_test.py:117: AssertionError
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ###############################################
INFO  lib389:ticket48194_test.py:41 ####### Test Case 9 - Check default nsSSL3Ciphers (default setting + allowWeakCipher: off)
INFO  lib389:ticket48194_test.py:42 ###############################################
INFO  lib389.utils:ticket48194_test.py:290 ######################### Restarting the server ######################
INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect to handshake failed
INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA
INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256\n'
Failed tickets/ticket48637_test.py::test_ticket48637 5.32
topology_st = <lib389.topologies.TopologyMain object at 0x7fa5efeac130>

def test_ticket48637(topology_st):
"""Test for entry cache corruption

This requires automember and managed entry plugins to be configured.

Then remove the group that automember would use to trigger a failure when
adding a new entry. Automember fails, and then managed entry also fails.

Make sure a base search on the entry returns error 32
"""

if DEBUGGING:
# Add debugging steps(if any)...
pass

#
# Add our setup entries
#
try:
topology_st.standalone.add_s(Entry((PEOPLE_OU, {
'objectclass': 'top organizationalunit'.split(),
'ou': 'people'})))
except ldap.ALREADY_EXISTS:
pass
except ldap.LDAPError as e:
log.fatal('Failed to add people ou: ' + str(e))
assert False

try:
topology_st.standalone.add_s(Entry((GROUP_OU, {
'objectclass': 'top organizationalunit'.split(),
'ou': 'groups'})))
except ldap.ALREADY_EXISTS:
pass
except ldap.LDAPError as e:
log.fatal('Failed to add groups ou: ' + str(e))
assert False

try:
topology_st.standalone.add_s(Entry((MEP_OU, {
'objectclass': 'top extensibleObject'.split(),
'ou': 'mep'})))
except ldap.LDAPError as e:
log.fatal('Failed to add MEP ou: ' + str(e))
assert False

try:
topology_st.standalone.add_s(Entry((MEP_TEMPLATE, {
'objectclass': 'top mepTemplateEntry'.split(),
'cn': 'mep template',
'mepRDNAttr': 'cn',
'mepStaticAttr': 'objectclass: groupofuniquenames',
'mepMappedAttr': 'cn: $uid'})))
except ldap.LDAPError as e:
log.fatal('Failed to add MEP ou: ' + str(e))
assert False

#
# Configure automember
#
try:
topology_st.standalone.add_s(Entry((AUTO_DN, {
'cn': 'All Users',
'objectclass': ['top', 'autoMemberDefinition'],
'autoMemberScope': 'dc=example,dc=com',
'autoMemberFilter': 'objectclass=person',
'autoMemberDefaultGroup': GROUP_DN,
'autoMemberGroupingAttr': 'uniquemember:dn'})))
except ldap.LDAPError as e:
log.fatal('Failed to configure automember plugin : ' + str(e))
assert False

#
# Configure managed entry plugin
#
try:
topology_st.standalone.add_s(Entry((MEP_DN, {
'cn': 'MEP Definition',
'objectclass': ['top', 'extensibleObject'],
'originScope': 'ou=people,dc=example,dc=com',
'originFilter': 'objectclass=person',
'managedBase': 'ou=groups,dc=example,dc=com',
'managedTemplate': MEP_TEMPLATE})))
except ldap.LDAPError as e:
log.fatal('Failed to configure managed entry plugin : ' + str(e))
assert False

#
# Restart DS
#
topology_st.standalone.restart(timeout=30)

#
# Add entry that should fail since the automember group does not exist
#
try:
topology_st.standalone.add_s(Entry((USER_DN, {
'uid': 'test',
'objectclass': ['top', 'person', 'extensibleObject'],
'sn': 'test',
'cn': 'test'})))
except ldap.LDAPError as e:
pass

#
# Search for the entry - it should not be returned
#
try:
entry = topology_st.standalone.search_s(USER_DN, ldap.SCOPE_SUBTREE,
'objectclass=*')
if entry:
log.fatal('Entry was incorrectly returned')
> assert False
E assert False

tickets/ticket48637_test.py:139: AssertionError
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
CRITICAL tests.tickets.ticket48637_test:ticket48637_test.py:138 Entry was incorrectly returned
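The docstring promises error 32 on the search for the half-created entry, which can be asserted directly instead of inspecting a result list. A sketch using pytest.raises; SCOPE_BASE matches the "base search" the docstring describes, whereas the original code used SCOPE_SUBTREE:

import pytest

with pytest.raises(ldap.NO_SUCH_OBJECT):
    topology_st.standalone.search_s(USER_DN, ldap.SCOPE_BASE, 'objectclass=*')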
Failed tickets/ticket48798_test.py::test_ticket48798 13.20
topology_st = <lib389.topologies.TopologyMain object at 0x7fa5efa7a400>

def test_ticket48798(topology_st):
"""
Test DH param sizes offered by DS.

"""
topology_st.standalone.enable_tls()

# Confirm that we have a connection, and that it has DH

# Open a socket to the port.
# Check the security settings.
> size = check_socket_dh_param_size(topology_st.standalone.host, topology_st.standalone.sslport)

tickets/ticket48798_test.py:46:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tickets/ticket48798_test.py:23: in check_socket_dh_param_size
output = check_output(cmd, shell=True)
/usr/lib64/python3.8/subprocess.py:411: in check_output
return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

input = None, capture_output = False, timeout = None, check = True
popenargs = ('echo quit | openssl s_client -connect ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:63601 -msg -cipher DH | grep -A 1 ServerKeyExchange',)
kwargs = {'shell': True, 'stdout': -1}
process = <subprocess.Popen object at 0x7fa5efe04700>, stdout = b''
stderr = None, retcode = 1

def run(*popenargs,
input=None, capture_output=False, timeout=None, check=False, **kwargs):
"""Run command with arguments and return a CompletedProcess instance.

The returned instance will have attributes args, returncode, stdout and
stderr. By default, stdout and stderr are not captured, and those attributes
will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.

If check is True and the exit code was non-zero, it raises a
CalledProcessError. The CalledProcessError object will have the return code
in the returncode attribute, and output & stderr attributes if those streams
were captured.

If timeout is given, and the process takes too long, a TimeoutExpired
exception will be raised.

There is an optional argument "input", allowing you to
pass bytes or a string to the subprocess's stdin. If you use this argument
you may not also use the Popen constructor's "stdin" argument, as
it will be used internally.

By default, all communication is in bytes, and therefore any "input" should
be bytes, and the stdout and stderr will be bytes. If in text mode, any
"input" should be a string, and stdout and stderr will be strings decoded
according to locale encoding, or by "encoding" if set. Text mode is
triggered by setting any of text, encoding, errors or universal_newlines.

The other arguments are the same as for the Popen constructor.
"""
if input is not None:
if kwargs.get('stdin') is not None:
raise ValueError('stdin and input arguments may not both be used.')
kwargs['stdin'] = PIPE

if capture_output:
if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
raise ValueError('stdout and stderr arguments may not be used '
'with capture_output.')
kwargs['stdout'] = PIPE
kwargs['stderr'] = PIPE

with Popen(*popenargs, **kwargs) as process:
try:
stdout, stderr = process.communicate(input, timeout=timeout)
except TimeoutExpired as exc:
process.kill()
if _mswindows:
# Windows accumulates the output in a single blocking
# read() call run on child threads, with the timeout
# being done in a join() on those threads. communicate()
# _after_ kill() is required to collect that and add it
# to the exception.
exc.stdout, exc.stderr = process.communicate()
else:
# POSIX _communicate already populated the output so
# far into the TimeoutExpired exception.
process.wait()
raise
except: # Including KeyboardInterrupt, communicate handled that.
process.kill()
# We don't call process.wait() as .__exit__ does that for us.
raise
retcode = process.poll()
if check and retcode:
> raise CalledProcessError(retcode, process.args,
output=stdout, stderr=stderr)
E subprocess.CalledProcessError: Command 'echo quit | openssl s_client -connect ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:63601 -msg -cipher DH | grep -A 1 ServerKeyExchange' returned non-zero exit status 1.

/usr/lib64/python3.8/subprocess.py:512: CalledProcessError
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
depth=1 C = AU, ST = Queensland, L = 389ds, O = testing, CN = ssca.389ds.example.com
verify return:1
depth=0 C = AU, ST = Queensland, L = 389ds, O = testing, GN = 4ffa6e07-16f1-4ebb-99fa-66cbfe6854bd, CN = ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com
verify error:num=7:certificate signature failure
verify return:1
depth=0 C = AU, ST = Queensland, L = 389ds, O = testing, GN = 4ffa6e07-16f1-4ebb-99fa-66cbfe6854bd, CN = ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com
verify return:1
DONE
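The CalledProcessError comes from check_output running with check=True: grep exits 1 when no ServerKeyExchange message appears in the handshake. A sketch of the same probe that reports the empty result instead of raising, with host and port as placeholders:

import subprocess

def dh_server_key_exchange(host, port):
    # Negotiate a DH cipher and capture the ServerKeyExchange
    # handshake message, as check_socket_dh_param_size does.
    cmd = ("echo quit | openssl s_client -connect %s:%s -msg -cipher DH "
           "| grep -A 1 ServerKeyExchange" % (host, port))
    proc = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    # Empty stdout means the server offered no DH key exchange,
    # which is exactly the condition that made this test error out.
    return proc.stdout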
Failed tickets/ticket48961_test.py::test_ticket48961_storagescheme 0.26
topology_st = <lib389.topologies.TopologyMain object at 0x7fa5effebfa0>

def test_ticket48961_storagescheme(topology_st):
"""
Test deleting the storage scheme.
"""

default = topology_st.standalone.config.get_attr_val('passwordStorageScheme')
# Change it
topology_st.standalone.config.set('passwordStorageScheme', 'CLEAR')
# Now delete it
> topology_st.standalone.config.remove('passwordStorageScheme', None)

tickets/ticket48961_test.py:28:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:316: in remove
self.set(key, value, action=ldap.MOD_DELETE)
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:446: in set
return self._instance.modify_ext_s(self._dn, [(action, key, value)],
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:613: in modify_ext_s
resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7fa5effebeb0>
func = <built-in method result4 of LDAP object at 0x7fa5efb4b810>
args = (5, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.OPERATIONS_ERROR: {'msgtype': 103, 'msgid': 5, 'result': 1, 'desc': 'Operations error', 'ctrls': [], 'info': 'passwordStorageScheme: deleting the value is not allowed.'}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: OPERATIONS_ERROR
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
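The diagnostic is definitive: the server refuses to delete passwordStorageScheme. If that refusal is the intended behaviour, the deletion can only be asserted as a failure; a minimal sketch in the same lib389 style as the test body:

import ldap
import pytest

def test_storagescheme_delete_rejected(topology_st):
    # cn=config rejects the MOD_DELETE outright, surfacing it as
    # OPERATIONS_ERROR rather than resetting the attribute.
    with pytest.raises(ldap.OPERATIONS_ERROR):
        topology_st.standalone.config.remove('passwordStorageScheme', None)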
Failed tickets/ticket48961_test.py::test_ticket48961_deleteall 0.00
topology_st = <lib389.topologies.TopologyMain object at 0x7fa5effebfa0>

def test_ticket48961_deleteall(topology_st):
"""
Test that we can delete all valid attrs, and that a few are rejected.
"""
attr_to_test = {
'nsslapd-listenhost': 'localhost',
'nsslapd-securelistenhost': 'localhost',
'nsslapd-allowed-sasl-mechanisms': 'GSSAPI',
'nsslapd-svrtab': 'Some bogus data', # This one could reset?
}
attr_to_fail = {
# These are the values that should always be dn dse.ldif too
'nsslapd-localuser': 'dirsrv',
'nsslapd-defaultnamingcontext': 'dc=example,dc=com', # Can't delete
'nsslapd-accesslog': '/opt/dirsrv/var/log/dirsrv/slapd-standalone/access',
'nsslapd-auditlog': '/opt/dirsrv/var/log/dirsrv/slapd-standalone/audit',
'nsslapd-errorlog': '/opt/dirsrv/var/log/dirsrv/slapd-standalone/errors',
'nsslapd-tmpdir': '/tmp',
'nsslapd-rundir': '/opt/dirsrv/var/run/dirsrv',
'nsslapd-bakdir': '/opt/dirsrv/var/lib/dirsrv/slapd-standalone/bak',
'nsslapd-certdir': '/opt/dirsrv/etc/dirsrv/slapd-standalone',
'nsslapd-instancedir': '/opt/dirsrv/lib/dirsrv/slapd-standalone',
'nsslapd-ldifdir': '/opt/dirsrv/var/lib/dirsrv/slapd-standalone/ldif',
'nsslapd-lockdir': '/opt/dirsrv/var/lock/dirsrv/slapd-standalone',
'nsslapd-schemadir': '/opt/dirsrv/etc/dirsrv/slapd-standalone/schema',
'nsslapd-workingdir': '/opt/dirsrv/var/log/dirsrv/slapd-standalone',
'nsslapd-localhost': 'localhost.localdomain',
# These can't be reset, but might be in dse.ldif. Probably in libglobs.
'nsslapd-certmap-basedn': 'cn=certmap,cn=config',
'nsslapd-port': '38931', # Can't delete
'nsslapd-secureport': '636', # Can't delete
'nsslapd-conntablesize': '1048576',
'nsslapd-rootpw': '{SSHA512}...',
# These are hardcoded server magic.
'nsslapd-hash-filters': 'off', # Can't delete
'nsslapd-requiresrestart': 'cn=config:nsslapd-port', # Can't change
'nsslapd-plugin': 'cn=case ignore string syntax,cn=plugins,cn=config', # Can't change
'nsslapd-privatenamespaces': 'cn=schema', # Can't change
'nsslapd-allowed-to-delete-attrs': 'None', # Can't delete
'nsslapd-accesslog-list': 'List!', # Can't delete
'nsslapd-auditfaillog-list': 'List!',
'nsslapd-auditlog-list': 'List!',
'nsslapd-errorlog-list': 'List!',
'nsslapd-config': 'cn=config',
'nsslapd-versionstring': '389-Directory/1.3.6.0',
'objectclass': '',
'cn': '',
# These are the odd values
'nsslapd-backendconfig': 'cn=config,cn=userRoot,cn=ldbm database,cn=plugins,cn=config', # Doesn't exist?
'nsslapd-betype': 'ldbm database', # Doesn't exist?
'nsslapd-connection-buffer': 1, # Has an ldap problem
'nsslapd-malloc-mmap-threshold': '-10', # Defunct anyway
'nsslapd-malloc-mxfast': '-10',
'nsslapd-malloc-trim-threshold': '-10',
'nsslapd-referralmode': '',
'nsslapd-saslpath': '',
'passwordadmindn': '',
}

> config_entry = topology_st.standalone.config.raw_entry()

tickets/ticket48961_test.py:101:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.config.Config object at 0x7fa5efb46430>, name = 'raw_entry'

def __getattr__(self, name):
"""This enables a bit of magic to allow us to wrap any function ending with
_json to its form without json, then transformed. It means your function
*must* return its values as a dict of:

{ attr : [val, val, ...], attr : [], ... }
to be supported.
"""

if (name.endswith('_json')):
int_name = name.replace('_json', '')
pfunc = partial(self._jsonify, getattr(self, int_name))
return pfunc
else:
> raise AttributeError("'%s' object has no attribute '%s'" % (self.__class__.__name__, name))
E AttributeError: 'Config' object has no attribute 'raw_entry'

/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:199: AttributeError
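The __getattr__ shown above only synthesizes *_json names; raw_entry is simply absent from the Config API. A stripped-down sketch of that dispatch pattern (the class and method names here are illustrative, not lib389's):

from functools import partial

class JsonWrapped:
    def _jsonify(self, func, *args, **kwargs):
        # Wrap a {attr: [vals, ...]} return value in a JSON-friendly dict.
        return {'type': 'entry', 'attrs': func(*args, **kwargs)}

    def get_all(self):
        return {'cn': ['config']}

    def __getattr__(self, name):
        # Called only for missing attributes: *_json names are mapped
        # onto their plain counterparts; everything else raises
        # AttributeError exactly as in the failure above.
        if name.endswith('_json'):
            return partial(self._jsonify, getattr(self, name.replace('_json', '')))
        raise AttributeError("'%s' object has no attribute '%s'"
                             % (self.__class__.__name__, name))

# JsonWrapped().get_all_json() -> {'type': 'entry', 'attrs': {'cn': ['config']}}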
Failed tickets/ticket48973_test.py::test_ticket48973_homeDirectory_indexing 7.74
topology = <tests.tickets.ticket48973_test.TopologyStandalone object at 0x7fa5efb46ca0>

def test_ticket48973_homeDirectory_indexing(topology):
"""
Check that homeDirectory is indexed with the (ces) syntax:
- triggers indexing
- no failure on indexing
- do an indexed search with the exact value (ces) and expect no default_mr_indexer_create warning
- do an indexed search with the uppercase value (ces) and expect no default_mr_indexer_create warning
"""
entry_ext = 1

try:
ent = topology.standalone.getEntry(HOMEDIRECTORY_INDEX, ldap.SCOPE_BASE)
except ldap.NO_SUCH_OBJECT:
topology.standalone.add_s(Entry((HOMEDIRECTORY_INDEX, {
'objectclass': "top nsIndex".split(),
'cn': HOMEDIRECTORY_CN,
'nsSystemIndex': 'false',
'nsIndexType': 'eq'})))

args = {TASK_WAIT: True}
topology.standalone.tasks.reindex(suffix=SUFFIX, attrname='homeDirectory', args=args)

log.info("Check indexing succeeded with no specified matching rule")
assert not _find_first_indexing_failure(topology, "unknown or invalid matching rule")
assert not _find_first_indexing_failure(topology, "default_mr_indexer_create: warning")
assert not _find_first_indexing_failure(topology, "default_mr_indexer_create - Plugin .* does not handle")

_check_entry(topology, filterHead="homeDirectory", filterValueUpper=False, entry_ext=entry_ext,found=True, indexed=True)

> _check_entry(topology, filterHead="homeDirectory:caseExactIA5Match:", filterValueUpper=False, entry_ext=entry_ext, found=True, indexed=False)

tickets/ticket48973_test.py:251:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology = <tests.tickets.ticket48973_test.TopologyStandalone object at 0x7fa5efb46ca0>
filterHead = 'homeDirectory:caseExactIA5Match:', filterValueUpper = False
entry_ext = 1, found = True, indexed = False

def _check_entry(topology, filterHead=None, filterValueUpper=False, entry_ext=None, found=False, indexed=False):
# Search with CES with exact value -> find an entry + indexed
if filterValueUpper:
homehead = HOMEHEAD.upper()
else:
homehead = HOMEHEAD
searchedHome = "%s%d" % (homehead, entry_ext)
Filter = "(%s=%s)" % (filterHead, searchedHome)
log.info("Search %s" % Filter)
ents = topology.standalone.search_s(SUFFIX, ldap.SCOPE_SUBTREE, Filter)
if found:
assert len(ents) == 1
assert ents[0].hasAttr('homedirectory')
valueHome = ensure_bytes("%s%d" % (HOMEHEAD, entry_ext))
assert valueHome in ents[0].getValues('homedirectory')
else:
assert len(ents) == 0

result = _find_next_notes(topology, Filter)
log.info("result=%s" % result)
if indexed:
assert not "notes=U" in result
else:
> assert "notes=U" in result
E AssertionError: assert 'notes=U' in '[20/Oct/2020:21:41:51.534943846 -0400] conn=1 op=2 RESULT err=0 tag=101 nentries=1 wtime=0.000199514 optime=0.004148688 etime=0.004345413\n'

tickets/ticket48973_test.py:188: AssertionError
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:798 Index task index_attrs_10202020_214147 completed successfully
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:244 Check indexing succeeded with no specified matching rule
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:173 Search (homeDirectory=/home/xyz_1)
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:184 result=[20/Oct/2020:21:41:48.972865264 -0400] conn=1 op=9 RESULT err=0 tag=101 nentries=1 wtime=0.000265157 optime=0.000648971 etime=0.000910020
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:173 Search (homeDirectory:caseExactIA5Match:=/home/xyz_1)
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:184 result=[20/Oct/2020:21:41:51.534943846 -0400] conn=1 op=2 RESULT err=0 tag=101 nentries=1 wtime=0.000199514 optime=0.004148688 etime=0.004345413
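The assertion keys off the access log: an indexed search logs a bare RESULT line, while an unindexed one is flagged with notes=U (notes=A for a fully unindexed search). A minimal sketch of that check, taking the access log path as a parameter:

def search_was_unindexed(access_log_path, conn, op):
    # 389-ds marks unindexed evaluation with notes=U/notes=A on the
    # RESULT line of the access log; its absence is what failed here.
    with open(access_log_path) as f:
        for line in f:
            if ('conn=%d ' % conn) in line and ('op=%d ' % op) in line \
                    and 'RESULT' in line:
                return 'notes=U' in line or 'notes=A' in line
    return False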
Failed tickets/ticket49192_test.py::test_ticket49192 0.00
topo = <lib389.topologies.TopologyMain object at 0x7fa5efbbcc10>

def test_ticket49192(topo):
"""Trigger deadlock when removing suffix
"""

#
# Create a second suffix/backend
#
log.info('Creating second backend...')
> topo.standalone.backends.create(None, properties={
BACKEND_NAME: "Second_Backend",
'suffix': "o=hang.com",
})

tickets/ticket49192_test.py:35:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:1169: in create
return co.create(rdn, properties, self._basedn)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.backend.Backend object at 0x7fa5effeb0d0>, dn = None
properties = {'name': 'Second_Backend', 'suffix': 'o=hang.com'}
basedn = 'cn=ldbm database,cn=plugins,cn=config'

def create(self, dn=None, properties=None, basedn=DN_LDBM):
"""Add a new backend entry, create mapping tree,
and, if requested, sample entries

:param dn: DN of the new entry
:type dn: str
:param properties: Attributes and parameters for the new entry
:type properties: dict
:param basedn: Base DN of the new entry
:type basedn: str

:returns: DSLdapObject of the created entry
"""

sample_entries = False
parent_suffix = False

# normalize suffix (remove spaces between comps)
if dn is not None:
dn_comps = ldap.dn.explode_dn(dn.lower())
dn = ",".join(dn_comps)

if properties is not None:
> suffix_dn = properties['nsslapd-suffix'].lower()
E KeyError: 'nsslapd-suffix'

/usr/local/lib/python3.8/site-packages/lib389/backend.py:590: KeyError
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49192_test:ticket49192_test.py:34 Creating second backend...
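The KeyError pinpoints the cause: Backend.create() reads properties['nsslapd-suffix'], so a properties dict built around a bare 'suffix' key never reaches the mapping-tree setup. A sketch of the shape the traceback implies, keeping the test's own values:

from lib389.backend import Backends

def create_second_backend(inst):
    # 'cn' names the backend; the suffix must use the LDAP attribute
    # name 'nsslapd-suffix', not a bare 'suffix' key.
    backends = Backends(inst)
    return backends.create(properties={
        'cn': 'Second_Backend',
        'nsslapd-suffix': 'o=hang.com',
    })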
Failed tickets/ticket49303_test.py::test_ticket49303 16.04
topo = <lib389.topologies.TopologyMain object at 0x7fa5ef733fa0>

def test_ticket49303(topo):
"""
Test the nsTLSAllowClientRenegotiation setting.
"""
sslport = SECUREPORT_STANDALONE1

log.info("Ticket 49303 - Allow disabling of SSL renegotiation")

# No value set, defaults to reneg allowed
enable_ssl(topo.standalone, sslport)
> assert try_reneg(HOST_STANDALONE1, sslport) is True
E AssertionError: assert False is True
E + where False = try_reneg('LOCALHOST', 63601)

tickets/ticket49303_test.py:88: AssertionError
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49303_test:ticket49303_test.py:84 Ticket 49303 - Allow disabling of SSL renegotiation
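try_reneg itself is not captured in the report. One common probe that such a helper may wrap is feeding openssl s_client an R command, which requests a client-initiated renegotiation (the R handling and the RENEGOTIATING marker are s_client behaviour for TLS 1.2 and below; everything else here is an assumption):

import subprocess

def try_reneg(host, port):
    # 'R' on s_client's stdin asks for a renegotiation; s_client
    # prints RENEGOTIATING and either completes a new handshake or
    # the server drops the connection.
    proc = subprocess.run(
        'echo R | openssl s_client -connect %s:%s' % (host, port),
        shell=True, capture_output=True, text=True, timeout=30)
    return 'RENEGOTIATING' in proc.stderr and 'handshake failure' not in proc.stderr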
XFailed suites/acl/syntax_test.py::test_aci_invalid_syntax_fail[test_targattrfilters_18] 0.01
topo = <lib389.topologies.TopologyMain object at 0x7fa5f8bf2e80>
real_value = '(target = ldap:///cn=Jeff Vedder,ou=Product Development,dc=example,dc=com)(targetattr=*)(version 3.0; acl "Name of th...3123123123123123123123123123123123123123123123123123123123123123123123123123123123123123123123123123123123123123123";)'

@pytest.mark.xfail(reason='https://bugzilla.redhat.com/show_bug.cgi?id=1691473')
@pytest.mark.parametrize("real_value", [a[1] for a in FAILED],
ids=[a[0] for a in FAILED])
def test_aci_invalid_syntax_fail(topo, real_value):
"""

Try to set wrong ACI syntax.

:id: 83c40784-fff5-49c8-9535-7064c9c19e7e
:parametrized: yes
:setup: Standalone Instance
:steps:
1. Create ACI
2. Try to set up the ACI on the instance
:expectedresults:
1. It should pass
2. It should not pass
"""
domain = Domain(topo.standalone, DEFAULT_SUFFIX)
with pytest.raises(ldap.INVALID_SYNTAX):
> domain.add("aci", real_value)
E Failed: DID NOT RAISE <class 'ldap.INVALID_SYNTAX'>

suites/acl/syntax_test.py:215: Failed
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
XFailed suites/acl/syntax_test.py::test_aci_invalid_syntax_fail[test_targattrfilters_20] 0.15
topo = <lib389.topologies.TopologyMain object at 0x7fa5f8bf2e80>
real_value = '(target = ldap:///cn=Jeff Vedder,ou=Product Development,dc=example,dc=com)(targetattr=*)(version 3.0; acl "Name of the ACI"; deny(write)userdns="ldap:///anyone";)'

@pytest.mark.xfail(reason='https://bugzilla.redhat.com/show_bug.cgi?id=1691473')
@pytest.mark.parametrize("real_value", [a[1] for a in FAILED],
ids=[a[0] for a in FAILED])
def test_aci_invalid_syntax_fail(topo, real_value):
"""

Try to set wrong ACI syntax.

:id: 83c40784-fff5-49c8-9535-7064c9c19e7e
:parametrized: yes
:setup: Standalone Instance
:steps:
1. Create ACI
2. Try to set up the ACI on the instance
:expectedresults:
1. It should pass
2. It should not pass
"""
domain = Domain(topo.standalone, DEFAULT_SUFFIX)
with pytest.raises(ldap.INVALID_SYNTAX):
> domain.add("aci", real_value)
E Failed: DID NOT RAISE <class 'ldap.INVALID_SYNTAX'>

suites/acl/syntax_test.py:215: Failed
XFailed suites/acl/syntax_test.py::test_aci_invalid_syntax_fail[test_bind_rule_set_with_more_than_three] 0.15
topo = <lib389.topologies.TopologyMain object at 0x7fa5f8bf2e80>
real_value = '(target = ldap:///dc=example,dc=com)(targetattr=*)(version 3.0; acl "Name of the ACI"; deny absolute (all)userdn="ldap:////////anyone";)'

@pytest.mark.xfail(reason='https://bugzilla.redhat.com/show_bug.cgi?id=1691473')
@pytest.mark.parametrize("real_value", [a[1] for a in FAILED],
ids=[a[0] for a in FAILED])
def test_aci_invalid_syntax_fail(topo, real_value):
"""

Try to set wrong ACI syntax.

:id: 83c40784-fff5-49c8-9535-7064c9c19e7e
:parametrized: yes
:setup: Standalone Instance
:steps:
1. Create ACI
2. Try to set up the ACI on the instance
:expectedresults:
1. It should pass
2. It should not pass
"""
domain = Domain(topo.standalone, DEFAULT_SUFFIX)
with pytest.raises(ldap.INVALID_SYNTAX):
> domain.add("aci", real_value)
E Failed: DID NOT RAISE <class 'ldap.INVALID_SYNTAX'>

suites/acl/syntax_test.py:215: Failed
XFailed suites/acl/userattr_test.py::test_mod_see_also_positive[(LEVEL_3, CHILDREN)] 0.06
topo = <lib389.topologies.TopologyMain object at 0x7fa5f8d06040>
_add_user = None, user = 'uid=Grandparent,ou=Inheritance,dc=example,dc=com'
entry = 'ou=CHILDREN,ou=PARENTS,ou=GRANDPARENTS,ou=ANCESTORS,ou=Inheritance,dc=example,dc=com'

@pytest.mark.parametrize("user,entry", [
(CAN, ROLEDNACCESS),
(CAN, USERDNACCESS),
(CAN, GROUPDNACCESS),
(CAN, LDAPURLACCESS),
(CAN, ATTRNAMEACCESS),
(LEVEL_0, OU_2),
(LEVEL_1, ANCESTORS),
(LEVEL_2, GRANDPARENTS),
(LEVEL_4, OU_2),
(LEVEL_4, ANCESTORS),
(LEVEL_4, GRANDPARENTS),
(LEVEL_4, PARENTS),
(LEVEL_4, CHILDREN),
pytest.param(LEVEL_3, CHILDREN, marks=pytest.mark.xfail(reason="Possibly a bug")),
], ids=[
"(CAN,ROLEDNACCESS)",
"(CAN,USERDNACCESS)",
"(CAN,GROUPDNACCESS)",
"(CAN,LDAPURLACCESS)",
"(CAN,ATTRNAMEACCESS)",
"(LEVEL_0, OU_2)",
"(LEVEL_1,ANCESTORS)",
"(LEVEL_2,GRANDPARENTS)",
"(LEVEL_4,OU_2)",
"(LEVEL_4, ANCESTORS)",
"(LEVEL_4,GRANDPARENTS)",
"(LEVEL_4,PARENTS)",
"(LEVEL_4,CHILDREN)",
"(LEVEL_3, CHILDREN)"
])
def test_mod_see_also_positive(topo, _add_user, user, entry):
"""
Try to set seeAlso on an entry while bound as a specific user; it should
succeed as permitted by the ACI.

:id: 65745426-7a01-11e8-8ac2-8c16451d917b
:parametrized: yes
:setup: Standalone Instance
:steps:
1. Add test entry
2. Add ACI
3. User should follow ACI role
:expectedresults:
1. Entry should be added
2. Operation should succeed
3. Operation should succeed
"""
conn = UserAccount(topo.standalone, user).bind(PW_DM)
> UserAccount(conn, entry).replace('seeAlso', 'cn=1')

suites/acl/userattr_test.py:216:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:280: in replace
self.set(key, value, action=ldap.MOD_REPLACE)
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:446: in set
return self._instance.modify_ext_s(self._dn, [(action, key, value)],
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:613: in modify_ext_s
resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7fa5f8bdf760>
func = <built-in method result4 of LDAP object at 0x7fa5f84ddae0>
args = (5, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.INSUFFICIENT_ACCESS: {'msgtype': 103, 'msgid': 5, 'result': 50, 'desc': 'Insufficient access', 'ctrls': [], 'info': "Insufficient 'write' privilege to the 'seeAlso' attribute of entry 'ou=children,ou=parents,ou=grandparents,ou=ancestors,ou=inheritance,dc=example,dc=com'.\n"}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: INSUFFICIENT_ACCESS
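The xfail isolates the (LEVEL_3, CHILDREN) pair; until the suspected bug is resolved, the observed deny can be asserted directly, in the same lib389 style as the test body (the helper name and parameters are illustrative):

import ldap
import pytest
from lib389.idm.user import UserAccount

def assert_see_also_denied(inst, user_dn, entry_dn, password):
    # Bind as the restricted user, then expect the ACI to block the
    # seeAlso write with INSUFFICIENT_ACCESS, as logged above.
    conn = UserAccount(inst, user_dn).bind(password)
    with pytest.raises(ldap.INSUFFICIENT_ACCESS):
        UserAccount(conn, entry_dn).replace('seeAlso', 'cn=1')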
XFailed suites/config/config_test.py::test_defaultnamingcontext_1 0.20
topo = <lib389.topologies.TopologyMain object at 0x7fa5f6e6fb80>

@pytest.mark.xfail(reason="This may fail due to bug 1610234")
def test_defaultnamingcontext_1(topo):
"""This test case should be part of function test_defaultnamingcontext
Please move it back after we have a fix for bug 1610234
"""
log.info("Remove the original suffix which is currently nsslapd-defaultnamingcontext"
"and check nsslapd-defaultnamingcontext become empty.")

""" Please remove these declarations after moving the test
to function test_defaultnamingcontext
"""
backends = Backends(topo.standalone)
test_db2 = 'test2_db'
test_suffix2 = 'dc=test2,dc=com'
b2 = backends.create(properties={'cn': test_db2,
'nsslapd-suffix': test_suffix2})
b2.delete()
> assert topo.standalone.config.get_attr_val_utf8('nsslapd-defaultnamingcontext') == ' '
E AssertionError: assert 'dc=example,dc=com' == ' '
E Strings contain only whitespace, escaping them using repr()
E - ' '
E + 'dc=example,dc=com'

suites/config/config_test.py:280: AssertionError
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.config_test:config_test.py:268 Remove the original suffix which is currently nsslapd-defaultnamingcontextand check nsslapd-defaultnamingcontext become empty.
XFailed suites/export/export_test.py::test_dbtasks_db2ldif_with_non_accessible_ldif_file_path_output 3.50
topo = <lib389.topologies.TopologyMain object at 0x7fa5f7007ac0>

@pytest.mark.bz1860291
@pytest.mark.xfail(reason="bug 1860291")
@pytest.mark.skipif(ds_is_older("1.3.10", "1.4.2"), reason="Not implemented")
def test_dbtasks_db2ldif_with_non_accessible_ldif_file_path_output(topo):
"""Export with db2ldif, giving a ldif file path which can't be accessed by the user (dirsrv by default)

:id: fcc63387-e650-40a7-b643-baa68c190037
:setup: Standalone Instance - entries imported in the db
:steps:
1. Stop the server
2. Launch db2ldif with a non accessible ldif file path
3. check the error reported in the command output
:expectedresults:
1. Operation successful
2. Operation properly fails
3. A clear error message is reported as output of the CLI
"""
export_ldif = '/tmp/nonexistent/export.ldif'

log.info("Stopping the instance...")
topo.standalone.stop()

log.info("Performing an offline export to a non accessible ldif file path - should fail and output a clear error message")
expected_output="No such file or directory"
> run_db2ldif_and_clear_logs(topo, topo.standalone, DEFAULT_BENAME, export_ldif, expected_output)

suites/export/export_test.py:150:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology = <lib389.topologies.TopologyMain object at 0x7fa5f7007ac0>
instance = <lib389.DirSrv object at 0x7fa5f7007a60>, backend = 'userRoot'
ldif = '/tmp/nonexistent/export.ldif', output_msg = 'No such file or directory'
encrypt = False, repl = False

def run_db2ldif_and_clear_logs(topology, instance, backend, ldif, output_msg, encrypt=False, repl=False):
args = FakeArgs()
args.instance = instance.serverid
args.backend = backend
args.encrypted = encrypt
args.replication = repl
args.ldif = ldif

dbtasks_db2ldif(instance, topology.logcap.log, args)

log.info('checking output msg')
if not topology.logcap.contains(output_msg):
log.error('The output message is not the expected one')
> assert False
E assert False

suites/export/export_test.py:36: AssertionError
------------------------------Captured stderr call------------------------------
ldiffile: /tmp/nonexistent/export.ldif
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:export_test.py:145 Stopping the instance...
INFO  lib389.utils:export_test.py:148 Performing an offline export to a non accessible ldif file path - should fail and output a clear error message
CRITICAL LogCapture:dbtasks.py:40 db2ldif failed
INFO  lib389.utils:export_test.py:33 checking output msg
ERROR  lib389.utils:export_test.py:35 The output message is not the expected one
XFailed suites/replication/replica_config_test.py::test_agmt_num_add[nsds5ReplicaPort-0-65535-9999999999999999999999999999999999999999999999999999999999999999999-invalid-389] 0.04
topo = <lib389.topologies.TopologyMain object at 0x7fa5f5498460>
attr = 'nsds5ReplicaPort', too_small = '0', too_big = '65535'
overflow = '9999999999999999999999999999999999999999999999999999999999999999999'
notnum = 'invalid', valid = '389'

@pytest.mark.xfail(reason="Agreement validation current does not work.")
@pytest.mark.parametrize("attr, too_small, too_big, overflow, notnum, valid", agmt_attrs)
def test_agmt_num_add(topo, attr, too_small, too_big, overflow, notnum, valid):
"""Test all the number values you can set for a replica config entry

:id: a8b47d4a-a089-4d70-8070-e6181209bf94
:parametrized: yes
:setup: standalone instance
:steps:
1. Use a value that is too small
2. Use a value that is too big
3. Use a value that overflows the int
4. Use a value with character value (not a number)
5. Use a valid value
:expectedresults:
1. Add is rejected
2. Add is rejected
3. Add is rejected
4. Add is rejected
5. Add is allowed
"""

agmt_reset(topo)
replica = replica_setup(topo)

agmts = Agreements(topo.standalone, basedn=replica.dn)

# Test too small
perform_invalid_create(agmts, agmt_dict, attr, too_small)
# Test too big
> perform_invalid_create(agmts, agmt_dict, attr, too_big)

suites/replication/replica_config_test.py:217:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

many = <lib389.agreement.Agreements object at 0x7fa5f4f70c10>
properties = {'cn': 'test_agreement', 'nsDS5ReplicaBindDN': 'uid=tester', 'nsDS5ReplicaBindMethod': 'SIMPLE', 'nsDS5ReplicaHost': 'localhost.localdomain', ...}
attr = 'nsds5ReplicaPort', value = '65535'

def perform_invalid_create(many, properties, attr, value):
my_properties = copy.deepcopy(properties)
my_properties[attr] = value
with pytest.raises(ldap.LDAPError) as ei:
> many.create(properties=my_properties)
E Failed: DID NOT RAISE <class 'ldap.LDAPError'>

suites/replication/replica_config_test.py:108: Failed
XFailed suites/replication/replica_config_test.py::test_agmt_num_modify[nsds5ReplicaPort-0-65535-9999999999999999999999999999999999999999999999999999999999999999999-invalid-389] 0.14
topo = <lib389.topologies.TopologyMain object at 0x7fa5f5498460>
attr = 'nsds5ReplicaPort', too_small = '0', too_big = '65535'
overflow = '9999999999999999999999999999999999999999999999999999999999999999999'
notnum = 'invalid', valid = '389'

@pytest.mark.xfail(reason="Agreement validation current does not work.")
@pytest.mark.parametrize("attr, too_small, too_big, overflow, notnum, valid", agmt_attrs)
def test_agmt_num_modify(topo, attr, too_small, too_big, overflow, notnum, valid):
"""Test all the number values you can set for a replica config entry

:id: a8b47d4a-a089-4d70-8070-e6181209bf95
:parametrized: yes
:setup: standalone instance
:steps:
1. Replace a value that is too small
2. Replace a value that is too big
3. Replace a value that overflows the int
4. Replace a value with character value (not a number)
5. Replace a value with a valid value
:expectedresults:
1. Value is rejected
2. Value is rejected
3. Value is rejected
4. Value is rejected
5. Value is allowed
"""

agmt = agmt_setup(topo)

# Value too small
> perform_invalid_modify(agmt, attr, too_small)

suites/replication/replica_config_test.py:253:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

o = <lib389.agreement.Agreement object at 0x7fa5f52746d0>
attr = 'nsds5ReplicaPort', value = '0'

def perform_invalid_modify(o, attr, value):
with pytest.raises(ldap.LDAPError) as ei:
> o.replace(attr, value)
E Failed: DID NOT RAISE <class 'ldap.LDAPError'>

suites/replication/replica_config_test.py:113: Failed
XFailed suites/replication/replica_config_test.py::test_agmt_num_modify[nsds5ReplicaTimeout--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.16
topo = <lib389.topologies.TopologyMain object at 0x7fa5f5498460>
attr = 'nsds5ReplicaTimeout', too_small = '-1', too_big = '9223372036854775807'
overflow = '9999999999999999999999999999999999999999999999999999999999999999999'
notnum = 'invalid', valid = '6'

@pytest.mark.xfail(reason="Agreement validation current does not work.")
@pytest.mark.parametrize("attr, too_small, too_big, overflow, notnum, valid", agmt_attrs)
def test_agmt_num_modify(topo, attr, too_small, too_big, overflow, notnum, valid):
"""Test all the number values you can set for a replica config entry

:id: a8b47d4a-a089-4d70-8070-e6181209bf95
:parametrized: yes
:setup: standalone instance
:steps:
1. Replace a value that is too small
2. Replace a value that is too big
3. Replace a value that overflows the int
4. Replace a value with character value (not a number)
5. Replace a value with a valid value
:expectedresults:
1. Value is rejected
2. Value is rejected
3. Value is rejected
4. Value is rejected
5. Value is allowed
"""

agmt = agmt_setup(topo)

# Value too small
perform_invalid_modify(agmt, attr, too_small)
# Value too big
> perform_invalid_modify(agmt, attr, too_big)

suites/replication/replica_config_test.py:255:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

o = <lib389.agreement.Agreement object at 0x7fa5f526fac0>
attr = 'nsds5ReplicaTimeout', value = '9223372036854775807'

def perform_invalid_modify(o, attr, value):
with pytest.raises(ldap.LDAPError) as ei:
> o.replace(attr, value)
E Failed: DID NOT RAISE <class 'ldap.LDAPError'>

suites/replication/replica_config_test.py:113: Failed
XFailed suites/replication/replica_config_test.py::test_agmt_num_modify[nsds5ReplicaBusyWaitTime--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.15
topo = <lib389.topologies.TopologyMain object at 0x7fa5f5498460>
attr = 'nsds5ReplicaBusyWaitTime', too_small = '-1'
too_big = '9223372036854775807'
overflow = '9999999999999999999999999999999999999999999999999999999999999999999'
notnum = 'invalid', valid = '6'

@pytest.mark.xfail(reason="Agreement validation current does not work.")
@pytest.mark.parametrize("attr, too_small, too_big, overflow, notnum, valid", agmt_attrs)
def test_agmt_num_modify(topo, attr, too_small, too_big, overflow, notnum, valid):
"""Test all the number values you can set for a replica config entry

:id: a8b47d4a-a089-4d70-8070-e6181209bf95
:parametrized: yes
:setup: standalone instance
:steps:
1. Replace a value that is too small
2. Replace a value that is too big
3. Replace a value that overflows the int
4. Replace a value with character value (not a number)
5. Replace a value with a valid value
:expectedresults:
1. Value is rejected
2. Value is rejected
3. Value is rejected
4. Value is rejected
5. Value is allowed
"""

agmt = agmt_setup(topo)

# Value too small
perform_invalid_modify(agmt, attr, too_small)
# Value too big
> perform_invalid_modify(agmt, attr, too_big)

suites/replication/replica_config_test.py:255:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

o = <lib389.agreement.Agreement object at 0x7fa5f52aa2b0>
attr = 'nsds5ReplicaBusyWaitTime', value = '9223372036854775807'

def perform_invalid_modify(o, attr, value):
with pytest.raises(ldap.LDAPError) as ei:
> o.replace(attr, value)
E Failed: DID NOT RAISE <class 'ldap.LDAPError'>

suites/replication/replica_config_test.py:113: Failed
XFailed suites/replication/replica_config_test.py::test_agmt_num_modify[nsds5ReplicaSessionPauseTime--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.15
topo = <lib389.topologies.TopologyMain object at 0x7fa5f5498460>
attr = 'nsds5ReplicaSessionPauseTime', too_small = '-1'
too_big = '9223372036854775807'
overflow = '9999999999999999999999999999999999999999999999999999999999999999999'
notnum = 'invalid', valid = '6'

@pytest.mark.xfail(reason="Agreement validation current does not work.")
@pytest.mark.parametrize("attr, too_small, too_big, overflow, notnum, valid", agmt_attrs)
def test_agmt_num_modify(topo, attr, too_small, too_big, overflow, notnum, valid):
"""Test all the number values you can set for a replica config entry

:id: a8b47d4a-a089-4d70-8070-e6181209bf95
:parametrized: yes
:setup: standalone instance
:steps:
1. Replace a value that is too small
2. Replace a value that is too big
3. Replace a value that overflows the int
4. Replace a value with character value (not a number)
5. Replace a value with a valid value
:expectedresults:
1. Value is rejected
2. Value is rejected
3. Value is rejected
4. Value is rejected
5. Value is allowed
"""

agmt = agmt_setup(topo)

# Value too small
perform_invalid_modify(agmt, attr, too_small)
# Value too big
> perform_invalid_modify(agmt, attr, too_big)

suites/replication/replica_config_test.py:255:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

o = <lib389.agreement.Agreement object at 0x7fa5f524e640>
attr = 'nsds5ReplicaSessionPauseTime', value = '9223372036854775807'

def perform_invalid_modify(o, attr, value):
with pytest.raises(ldap.LDAPError) as ei:
> o.replace(attr, value)
E Failed: DID NOT RAISE <class 'ldap.LDAPError'>

suites/replication/replica_config_test.py:113: Failed
XFailed suites/replication/replica_config_test.py::test_agmt_num_modify[nsds5ReplicaFlowControlWindow--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.18
topo = <lib389.topologies.TopologyMain object at 0x7fa5f5498460>
attr = 'nsds5ReplicaFlowControlWindow', too_small = '-1'
too_big = '9223372036854775807'
overflow = '9999999999999999999999999999999999999999999999999999999999999999999'
notnum = 'invalid', valid = '6'

@pytest.mark.xfail(reason="Agreement validation current does not work.")
@pytest.mark.parametrize("attr, too_small, too_big, overflow, notnum, valid", agmt_attrs)
def test_agmt_num_modify(topo, attr, too_small, too_big, overflow, notnum, valid):
"""Test all the number values you can set for a replica config entry

:id: a8b47d4a-a089-4d70-8070-e6181209bf95
:parametrized: yes
:setup: standalone instance
:steps:
1. Replace a value that is too small
2. Replace a value that is too big
3. Replace a value that overflows the int
4. Replace a value with character value (not a number)
5. Replace a value with a valid value
:expectedresults:
1. Value is rejected
2. Value is rejected
3. Value is rejected
4. Value is rejected
5. Value is allowed
"""

agmt = agmt_setup(topo)

# Value too small
perform_invalid_modify(agmt, attr, too_small)
# Value too big
> perform_invalid_modify(agmt, attr, too_big)

suites/replication/replica_config_test.py:255:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

o = <lib389.agreement.Agreement object at 0x7fa5f52421f0>
attr = 'nsds5ReplicaFlowControlWindow', value = '9223372036854775807'

def perform_invalid_modify(o, attr, value):
with pytest.raises(ldap.LDAPError) as ei:
> o.replace(attr, value)
E Failed: DID NOT RAISE <class 'ldap.LDAPError'>

suites/replication/replica_config_test.py:113: Failed
XFailed suites/replication/replica_config_test.py::test_agmt_num_modify[nsds5ReplicaFlowControlPause--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.16
topo = <lib389.topologies.TopologyMain object at 0x7fa5f5498460>
attr = 'nsds5ReplicaFlowControlPause', too_small = '-1'
too_big = '9223372036854775807'
overflow = '9999999999999999999999999999999999999999999999999999999999999999999'
notnum = 'invalid', valid = '6'

@pytest.mark.xfail(reason="Agreement validation current does not work.")
@pytest.mark.parametrize("attr, too_small, too_big, overflow, notnum, valid", agmt_attrs)
def test_agmt_num_modify(topo, attr, too_small, too_big, overflow, notnum, valid):
"""Test all the number values you can set for a replica config entry

:id: a8b47d4a-a089-4d70-8070-e6181209bf95
:parametrized: yes
:setup: standalone instance
:steps:
1. Replace a value that is too small
2. Replace a value that is too big
3. Replace a value that overflows the int
4. Replace a value with character value (not a number)
5. Replace a value with a valid value
:expectedresults:
1. Value is rejected
2. Value is rejected
3. Value is rejected
4. Value is rejected
5. Value is allowed
"""

agmt = agmt_setup(topo)

# Value too small
perform_invalid_modify(agmt, attr, too_small)
# Value too big
> perform_invalid_modify(agmt, attr, too_big)

suites/replication/replica_config_test.py:255:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

o = <lib389.agreement.Agreement object at 0x7fa5f5255fa0>
attr = 'nsds5ReplicaFlowControlPause', value = '9223372036854775807'

def perform_invalid_modify(o, attr, value):
with pytest.raises(ldap.LDAPError) as ei:
> o.replace(attr, value)
E Failed: DID NOT RAISE <class 'ldap.LDAPError'>

suites/replication/replica_config_test.py:113: Failed
XFailed suites/replication/replica_config_test.py::test_agmt_num_modify[nsds5ReplicaProtocolTimeout--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.16
topo = <lib389.topologies.TopologyMain object at 0x7fa5f5498460>
attr = 'nsds5ReplicaProtocolTimeout', too_small = '-1'
too_big = '9223372036854775807'
overflow = '9999999999999999999999999999999999999999999999999999999999999999999'
notnum = 'invalid', valid = '6'

@pytest.mark.xfail(reason="Agreement validation current does not work.")
@pytest.mark.parametrize("attr, too_small, too_big, overflow, notnum, valid", agmt_attrs)
def test_agmt_num_modify(topo, attr, too_small, too_big, overflow, notnum, valid):
"""Test all the number values you can set for a replica config entry

:id: a8b47d4a-a089-4d70-8070-e6181209bf95
:parametrized: yes
:setup: standalone instance
:steps:
1. Replace a value that is too small
2. Replace a value that is too big
3. Replace a value that overflows the int
4. Replace a value with character value (not a number)
5. Replace a value with a valid value
:expectedresults:
1. Value is rejected
2. Value is rejected
3. Value is rejected
4. Value is rejected
5. Value is allowed
"""

agmt = agmt_setup(topo)

# Value too small
perform_invalid_modify(agmt, attr, too_small)
# Value too big
> perform_invalid_modify(agmt, attr, too_big)

suites/replication/replica_config_test.py:255:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

o = <lib389.agreement.Agreement object at 0x7fa5f529a910>
attr = 'nsds5ReplicaProtocolTimeout', value = '9223372036854775807'

def perform_invalid_modify(o, attr, value):
with pytest.raises(ldap.LDAPError) as ei:
> o.replace(attr, value)
E Failed: DID NOT RAISE <class 'ldap.LDAPError'>

suites/replication/replica_config_test.py:113: Failed
XFailed suites/replication/ruvstore_test.py::test_memoryruv_sync_with_databaseruv::setup 0.00
request = <SubRequest 'topo' for <Function test_ruv_entry_backup>>

@pytest.fixture(scope="module")
def topology_m2(request):
"""Create Replication Deployment with two masters"""

> topology = create_topology({ReplicaRole.MASTER: 2})

/usr/local/lib/python3.8/site-packages/lib389/topologies.py:435:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/topologies.py:163: in create_topology
repl.join_master(first_master, m)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2167: in join_master
self.test_replication(from_instance, to_instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2534: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7fa5f52885e0>
from_instance = <lib389.DirSrv object at 0x7fa5f8accc10>
to_instance = <lib389.DirSrv object at 0x7fa5f5294ca0>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occurred.

:param from_instance: The instance whose state we want to check from.
:type from_instance: lib389.DirSrv
:param to_instance: The instance whose state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2518: Exception
XPassed suites/acl/syntax_test.py::test_aci_invalid_syntax_fail[test_Use_double_equal_instead_of_equal_in_the_targetattr] 0.19
No log output captured.
XPassed suites/acl/syntax_test.py::test_aci_invalid_syntax_fail[test_Use_double_equal_instead_of_equal_in_the_targetfilter] 0.03
No log output captured.
XPassed suites/filter/filterscanlimit_test.py::test_invalid_configuration 1.33
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
XPassed suites/replication/replica_config_test.py::test_agmt_num_add[nsds5ReplicaTimeout--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.17
No log output captured.
XPassed suites/replication/replica_config_test.py::test_agmt_num_add[nsds5ReplicaBusyWaitTime--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.18
No log output captured.
XPassed suites/replication/replica_config_test.py::test_agmt_num_add[nsds5ReplicaSessionPauseTime--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.17
No log output captured.
XPassed suites/replication/replica_config_test.py::test_agmt_num_add[nsds5ReplicaFlowControlWindow--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.17
No log output captured.
XPassed suites/replication/replica_config_test.py::test_agmt_num_add[nsds5ReplicaFlowControlPause--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.17
No log output captured.
XPassed suites/replication/replica_config_test.py::test_agmt_num_add[nsds5ReplicaProtocolTimeout--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.17
No log output captured.
Skipped suites/auth_token/basic_auth_test.py::test_ldap_auth_token_config::setup 0.00
('suites/auth_token/basic_auth_test.py', 28, 'Skipped: Auth tokens are not available in older versions')
Skipped suites/auth_token/basic_auth_test.py::test_ldap_auth_token_nsuser::setup 0.00
('suites/auth_token/basic_auth_test.py', 75, 'Skipped: Auth tokens are not available in older versions')
Skipped suites/auth_token/basic_auth_test.py::test_ldap_auth_token_disabled::setup 0.00
('suites/auth_token/basic_auth_test.py', 144, 'Skipped: Auth tokens are not available in older versions')
Skipped suites/auth_token/basic_auth_test.py::test_ldap_auth_token_directory_manager::setup 0.00
('suites/auth_token/basic_auth_test.py', 194, 'Skipped: Auth tokens are not available in older versions')
Skipped suites/auth_token/basic_auth_test.py::test_ldap_auth_token_anonymous::setup 0.00
('suites/auth_token/basic_auth_test.py', 217, 'Skipped: Auth tokens are not available in older versions')
Skipped suites/config/regression_test.py::test_set_cachememsize_to_custom_value::setup 0.00
('suites/config/regression_test.py', 34, 'Skipped: available memory is too low')
Skipped suites/dynamic_plugins/notice_for_restart_test.py::test_notice_when_dynamic_not_enabled::setup 0.00
('suites/dynamic_plugins/notice_for_restart_test.py', 24, 'Skipped: Notice not generated in older versions')
Skipped suites/entryuuid/basic_test.py::test_entryuuid_indexed_import_and_search::setup 0.00
('suites/entryuuid/basic_test.py', 73, 'Skipped: Entryuuid is not available in older versions')
Skipped suites/entryuuid/basic_test.py::test_entryuuid_unindexed_import_and_search::setup 0.00
('suites/entryuuid/basic_test.py', 113, 'Skipped: Entryuuid is not available in older versions')
Skipped suites/entryuuid/basic_test.py::test_entryuuid_generation_on_add::setup 0.00
('suites/entryuuid/basic_test.py', 155, 'Skipped: Entryuuid is not available in older versions')
Skipped suites/entryuuid/basic_test.py::test_entryuuid_fixup_task::setup 0.00
('suites/entryuuid/basic_test.py', 179, 'Skipped: Entryuuid is not available in older versions')
Skipped suites/healthcheck/health_config_test.py::test_healthcheck_logging_format_should_be_revised::setup 0.00
('suites/healthcheck/health_config_test.py', 91, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/health_config_test.py::test_healthcheck_RI_plugin_is_misconfigured::setup 0.00
('suites/healthcheck/health_config_test.py', 134, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/health_config_test.py::test_healthcheck_RI_plugin_missing_indexes::setup 0.00
('suites/healthcheck/health_config_test.py', 183, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/health_config_test.py::test_healthcheck_virtual_attr_incorrectly_indexed::setup 0.00
('suites/healthcheck/health_config_test.py', 235, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/health_config_test.py::test_healthcheck_low_disk_space::setup 0.00
('suites/healthcheck/health_config_test.py', 295, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/health_config_test.py::test_healthcheck_notes_unindexed_search::setup 0.00
('suites/healthcheck/health_config_test.py', 340, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/health_config_test.py::test_healthcheck_notes_unknown_attribute::setup 0.00
('suites/healthcheck/health_config_test.py', 389, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/health_repl_test.py::test_healthcheck_replication_replica_not_reachable::setup 0.00
('suites/healthcheck/health_repl_test.py', 80, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/health_repl_test.py::test_healthcheck_changelog_trimming_not_configured::setup 0.00
('suites/healthcheck/health_repl_test.py', 132, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/health_repl_test.py::test_healthcheck_replication_presence_of_conflict_entries::setup 0.00
('suites/healthcheck/health_repl_test.py', 179, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/health_repl_test.py::test_healthcheck_replication_out_of_sync_broken::setup 0.00
('suites/healthcheck/health_repl_test.py', 224, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/health_security_test.py::test_healthcheck_insecure_pwd_hash_configured::setup 0.00
('suites/healthcheck/health_security_test.py', 86, 'Skipped: These tests can only be run with python installer and disabled ASAN')
Skipped suites/healthcheck/health_security_test.py::test_healthcheck_min_allowed_tls_version_too_low::setup 0.00
('suites/healthcheck/health_security_test.py', 135, 'Skipped: These tests can only be run with python installer and disabled ASAN')
Skipped suites/healthcheck/health_security_test.py::test_healthcheck_resolvconf_bad_file_perm::setup 0.00
('suites/healthcheck/health_security_test.py', 194, 'Skipped: These tests can only be run with python installer and disabled ASAN')
Skipped suites/healthcheck/health_security_test.py::test_healthcheck_pwdfile_bad_file_perm::setup 0.00
('suites/healthcheck/health_security_test.py', 237, 'Skipped: These tests can only be run with python installer and disabled ASAN')
Skipped suites/healthcheck/health_security_test.py::test_healthcheck_certif_expiring_within_30d::setup 0.00
('suites/healthcheck/health_security_test.py', 281, 'Skipped: These tests can only be run with python installer and disabled ASAN')
Skipped suites/healthcheck/health_security_test.py::test_healthcheck_certif_expired::setup 0.00
('suites/healthcheck/health_security_test.py', 319, 'Skipped: These tests can only be run with python installer and disabled ASAN')
Skipped suites/healthcheck/health_sync_test.py::test_healthcheck_replication_out_of_sync_not_broken::setup 0.00
('suites/healthcheck/health_sync_test.py', 70, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/healthcheck_test.py::test_healthcheck_disabled_suffix::setup 0.00
('suites/healthcheck/healthcheck_test.py', 75, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/healthcheck_test.py::test_healthcheck_standalone::setup 0.00
('suites/healthcheck/healthcheck_test.py', 103, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/healthcheck_test.py::test_healthcheck_list_checks::setup 0.00
('suites/healthcheck/healthcheck_test.py', 127, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/healthcheck_test.py::test_healthcheck_list_errors::setup 0.00
('suites/healthcheck/healthcheck_test.py', 167, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/healthcheck_test.py::test_healthcheck_check_option::setup 0.00
('suites/healthcheck/healthcheck_test.py', 216, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/healthcheck_test.py::test_healthcheck_standalone_tls::setup 0.00
('suites/healthcheck/healthcheck_test.py', 261, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/healthcheck_test.py::test_healthcheck_replication::setup 0.00
('suites/healthcheck/healthcheck_test.py', 288, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/healthcheck_test.py::test_healthcheck_replication_tls::setup 0.00
('suites/healthcheck/healthcheck_test.py', 324, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/healthcheck_test.py::test_healthcheck_backend_missing_mapping_tree::setup 0.00
('suites/healthcheck/healthcheck_test.py', 361, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/healthcheck_test.py::test_healthcheck_unable_to_query_backend::setup 0.00
('suites/healthcheck/healthcheck_test.py', 412, 'Skipped: These tests need to use python installer')
Skipped suites/healthcheck/healthcheck_test.py::test_healthcheck_database_not_initialized::setup 0.00
('suites/healthcheck/healthcheck_test.py', 462, 'Skipped: These tests need to use python installer')
Skipped suites/indexes/regression_test.py::test_reindex_task_creates_abandoned_index_file::setup 0.00
('suites/indexes/regression_test.py', 23, 'Skipped: Not implemented')
Skipped suites/memory_leaks/MMR_double_free_test.py::test_MMR_double_free::setup 0.00
('suites/memory_leaks/MMR_double_free_test.py', 67, "Skipped: Don't run if ASAN is not enabled")
Skipped suites/memory_leaks/range_search_test.py::test_range_search::setup 0.00
('suites/memory_leaks/range_search_test.py', 24, "Skipped: Don't run if ASAN is not enabled")
Skipped suites/migration/export_data_test.py::test_export_data_from_source_host::setup 0.00
('suites/migration/export_data_test.py', 24, 'Skipped: This test is meant to execute in specific test environment')
Skipped suites/migration/import_data_test.py::test_import_data_to_target_host::setup 0.00
('suites/migration/import_data_test.py', 24, 'Skipped: This test is meant to execute in specific test environment')
Skipped suites/replication/acceptance_test.py::test_csngen_task::setup 0.00
('suites/replication/acceptance_test.py', 504, 'Skipped: Not implemented')
Skipped suites/replication/changelog_test.py::test_cldump_files_removed::setup 0.00
('suites/replication/changelog_test.py', 235, 'Skipped: does not work for prefix builds')
Skipped suites/rewriters/adfilter_test.py::test_adfilter_objectSid::setup 0.00
('suites/rewriters/adfilter_test.py', 90, 'Skipped: It is missing samba python bindings')
Skipped suites/syncrepl_plugin/openldap_test.py::test_syncrepl_openldap::setup 0.00
('suites/syncrepl_plugin/openldap_test.py', 28, 'Skipped: Sync repl does not support openldap compat in older versions')
Skipped tickets/ticket47462_test.py::test_ticket47462::setup 0.00
('tickets/ticket47462_test.py', 39, 'Skipped: Upgrade scripts are supported only on versions < 1.4.x')
Skipped tickets/ticket47815_test.py::test_ticket47815::setup 0.00
('tickets/ticket47815_test.py', 26, 'Skipped: Not implemented, or invalid by nsMemberOf')
Skipped tickets/ticket49121_test.py::test_ticket49121::setup 0.00
('tickets/ticket49121_test.py', 32, "Skipped: Don't run if ASAN is not enabled")
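The skip reasons above are emitted by pytest markers inside each suite. As a rough, hypothetical sketch of the pattern (the environment check below is an assumption for illustration, not the suites' actual condition):

    import os

    import pytest

    # Hedged sketch: a conditional skip of the kind that produces the
    # "Don't run if ASAN is not enabled" reason listed above. The
    # ASAN_ENABLED environment variable is an assumed stand-in for the
    # real check used by the suites.
    @pytest.mark.skipif(os.getenv('ASAN_ENABLED') is None,
                        reason="Don't run if ASAN is not enabled")
    def test_requires_asan():
        assert True

When the condition holds, the test surfaces in this report as a "Skipped" row with the (file, line, reason) triple shown above.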
Passed suites/acl/acivattr_test.py::test_positive[(ENG_USER, ENG_MANAGER, REAL_EQ_ACI)] 0.05
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/acivattr_test.py::test_positive[(ENG_USER, ENG_MANAGER, REAL_PRES_ACI)] 0.06
No log output captured.
Passed suites/acl/acivattr_test.py::test_positive[(ENG_USER, ENG_MANAGER, REAL_SUB_ACI)] 0.04
No log output captured.
Passed suites/acl/acivattr_test.py::test_positive[(ENG_USER, ENG_MANAGER, ROLE_PRES_ACI)] 0.05
No log output captured.
Passed suites/acl/acivattr_test.py::test_positive[(ENG_USER, ENG_MANAGER, ROLE_SUB_ACI)] 0.04
No log output captured.
Passed suites/acl/acivattr_test.py::test_positive[(ENG_USER, ENG_MANAGER, COS_EQ_ACI)] 0.04
No log output captured.
Passed suites/acl/acivattr_test.py::test_positive[(ENG_USER, ENG_MANAGER, COS_PRES_ACI)] 0.04
No log output captured.
Passed suites/acl/acivattr_test.py::test_positive[(ENG_USER, ENG_MANAGER, COS_SUB_ACI)] 0.04
No log output captured.
Passed suites/acl/acivattr_test.py::test_positive[(ENG_USER, ENG_MANAGER, LDAPURL_ACI)] 0.04
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, SALES_MANAGER, REAL_EQ_ACI)] 0.06
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, SALES_OU, REAL_PRES_ACI)] 0.07
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, SALES_MANAGER, REAL_SUB_ACI)] 0.06
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, SALES_MANAGER, ROLE_EQ_ACI)] 0.06
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, SALES_MANAGER, ROLE_PRES_ACI)] 0.06
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, SALES_MANAGER, ROLE_SUB_ACI)] 0.05
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, SALES_MANAGER, COS_EQ_ACI)] 0.05
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, SALES_MANAGER, COS_PRES_ACI)] 0.05
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, SALES_MANAGER, COS_SUB_ACI)] 0.05
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(SALES_UESER, SALES_MANAGER, LDAPURL_ACI)] 0.06
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, ENG_MANAGER, ROLE_EQ_ACI)] 0.05
No log output captured.
Passed suites/acl/acl_deny_test.py::test_multi_deny_aci 14.13
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389:acl_deny_test.py:39 Add uid=tuser1,ou=People,dc=example,dc=com
INFO  lib389:acl_deny_test.py:50 Add uid=tuser,ou=People,dc=example,dc=com
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_deny_test.py:82 Pass 1
INFO  lib389:acl_deny_test.py:85 Testing two searches behave the same...
INFO  lib389:acl_deny_test.py:128 Testing search does not return any entries...
INFO  lib389:acl_deny_test.py:82 Pass 2
INFO  lib389:acl_deny_test.py:85 Testing two searches behave the same...
INFO  lib389:acl_deny_test.py:128 Testing search does not return any entries...
INFO  lib389:acl_deny_test.py:192 Test PASSED
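test_multi_deny_aci verifies that searches bound as a denied user come back empty rather than erroring. A minimal python-ldap sketch of installing one such deny ACI (the port follows the setup log; the bind password and ACI text are illustrative assumptions, not the test's actual values):

    import ldap

    SUFFIX = "dc=example,dc=com"  # suffix taken from the setup log above

    conn = ldap.initialize("ldap://localhost:38901")
    conn.simple_bind_s("cn=Directory Manager", "password")  # password assumed

    # Deny all rights on the suffix to one user; a search bound as that
    # user should then return no entries instead of an error.
    aci = ('(targetattr="*")(version 3.0; acl "deny tuser"; '
           'deny (all) userdn="ldap:///uid=tuser,ou=People,' + SUFFIX + '";)')
    conn.modify_s(SUFFIX, [(ldap.MOD_ADD, "aci", [aci.encode()])])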
Passed suites/acl/deladd_test.py::test_allow_delete_access_to_groupdn 0.33
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/deladd_test.py::test_allow_add_access_to_anyone 0.05
No log output captured.
Passed suites/acl/deladd_test.py::test_allow_delete_access_to_anyone 0.04
No log output captured.
Passed suites/acl/deladd_test.py::test_allow_delete_access_not_to_userdn 0.07
No log output captured.
Passed suites/acl/deladd_test.py::test_allow_delete_access_not_to_group 0.08
No log output captured.
Passed suites/acl/deladd_test.py::test_allow_add_access_to_parent 0.05
No log output captured.
Passed suites/acl/deladd_test.py::test_allow_delete_access_to_parent 0.05
No log output captured.
Passed suites/acl/deladd_test.py::test_allow_delete_access_to_dynamic_group 0.06
No log output captured.
Passed suites/acl/deladd_test.py::test_allow_delete_access_to_dynamic_group_uid 0.05
No log output captured.
Passed suites/acl/deladd_test.py::test_allow_delete_access_not_to_dynamic_group 0.32
No log output captured.
Passed suites/acl/enhanced_aci_modrnd_test.py::test_enhanced_aci_modrnd 0.26
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
INFO  tests.suites.acl.enhanced_aci_modrnd_test:enhanced_aci_modrnd_test.py:32 Add a container: ou=test_ou_1,dc=example,dc=com
INFO  tests.suites.acl.enhanced_aci_modrnd_test:enhanced_aci_modrnd_test.py:39 Add a container: ou=test_ou_2,dc=example,dc=com
INFO  tests.suites.acl.enhanced_aci_modrnd_test:enhanced_aci_modrnd_test.py:46 Add a user: cn=test_user,ou=test_ou_1,dc=example,dc=com
INFO  tests.suites.acl.enhanced_aci_modrnd_test:enhanced_aci_modrnd_test.py:60 Add an ACI 'allow (all)' by cn=test_user,ou=test_ou_1,dc=example,dc=com to the ou=test_ou_1,dc=example,dc=com
INFO  tests.suites.acl.enhanced_aci_modrnd_test:enhanced_aci_modrnd_test.py:64 Add an ACI 'allow (all)' by cn=test_user,ou=test_ou_1,dc=example,dc=com to the ou=test_ou_2,dc=example,dc=com
-------------------------------Captured log call--------------------------------
INFO  tests.suites.acl.enhanced_aci_modrnd_test:enhanced_aci_modrnd_test.py:95 Bind as cn=test_user,ou=test_ou_1,dc=example,dc=com
INFO  tests.suites.acl.enhanced_aci_modrnd_test:enhanced_aci_modrnd_test.py:99 User MODRDN operation from ou=test_ou_1,dc=example,dc=com to ou=test_ou_2,dc=example,dc=com
INFO  tests.suites.acl.enhanced_aci_modrnd_test:enhanced_aci_modrnd_test.py:105 Check there is no user in ou=test_ou_1,dc=example,dc=com
INFO  tests.suites.acl.enhanced_aci_modrnd_test:enhanced_aci_modrnd_test.py:111 Check there is our user in ou=test_ou_2,dc=example,dc=com
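The MODRDN step above moves an entry between two containers, each guarded by its own ACI. A rough python-ldap equivalent of the user's rename (the port follows the setup log; the bind password is an assumption):

    import ldap

    conn = ldap.initialize("ldap://localhost:38901")
    conn.simple_bind_s("cn=test_user,ou=test_ou_1,dc=example,dc=com",
                       "password")  # password assumed

    # MODRDN that moves the entry from ou=test_ou_1 to ou=test_ou_2; the
    # ACIs on both containers must grant the bound user the needed rights
    # for the operation to succeed.
    conn.rename_s("cn=test_user,ou=test_ou_1,dc=example,dc=com",
                  "cn=test_user",
                  newsuperior="ou=test_ou_2,dc=example,dc=com")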
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_five 0.07
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_six 0.06
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_seven 0.03
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_eight 0.03
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_nine 0.03
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_ten 0.05
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_eleven 0.04
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_twelve 0.03
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_fourteen 0.06
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_fifteen 0.05
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_sixteen 0.03
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_seventeen 0.03
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_eighteen 0.28
No log output captured.
Passed suites/acl/globalgroup_test.py::test_caching_changes 0.06
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/globalgroup_test.py::test_deny_group_member_all_rights_to_user 0.06
No log output captured.
Passed suites/acl/globalgroup_test.py::test_deny_group_member_all_rights_to_group_members 0.03
No log output captured.
Passed suites/acl/globalgroup_test.py::test_deeply_nested_groups_aci_denial 0.05
No log output captured.
Passed suites/acl/globalgroup_test.py::test_deeply_nested_groups_aci_denial_two 0.03
No log output captured.
Passed suites/acl/globalgroup_test.py::test_deeply_nested_groups_aci_allow 0.02
No log output captured.
Passed suites/acl/globalgroup_test.py::test_deeply_nested_groups_aci_allow_two 0.03
No log output captured.
Passed suites/acl/globalgroup_test.py::test_undefined_in_group_eval 0.03
No log output captured.
Passed suites/acl/globalgroup_test.py::test_undefined_in_group_eval_two 0.02
No log output captured.
Passed suites/acl/globalgroup_test.py::test_undefined_in_group_eval_three 0.02
No log output captured.
Passed suites/acl/globalgroup_test.py::test_undefined_in_group_eval_four 0.05
No log output captured.
Passed suites/acl/keywords_part2_test.py::test_access_from_certain_network_only_ip 5.48
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/keywords_part2_test.py::test_connectin_from_an_unauthorized_network 0.12
No log output captured.
Passed suites/acl/keywords_part2_test.py::test_ip_keyword_test_noip_cannot 0.09
No log output captured.
Passed suites/acl/keywords_part2_test.py::test_user_can_access_the_data_at_any_time 0.07
No log output captured.
Passed suites/acl/keywords_part2_test.py::test_user_can_access_the_data_only_in_the_morning 0.09
No log output captured.
Passed suites/acl/keywords_part2_test.py::test_user_can_access_the_data_only_in_the_afternoon 0.08
No log output captured.
Passed suites/acl/keywords_part2_test.py::test_timeofday_keyword 1.13
No log output captured.
Passed suites/acl/keywords_part2_test.py::test_dayofweek_keyword_test_everyday_can_access 0.08
No log output captured.
Passed suites/acl/keywords_part2_test.py::test_dayofweek_keyword_today_can_access 0.08
No log output captured.
Passed suites/acl/keywords_part2_test.py::test_user_cannot_access_the_data_at_all 0.08
No log output captured.
Passed suites/acl/keywords_test.py::test_user_binds_with_a_password_and_can_access_the_data 0.05
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/keywords_test.py::test_user_binds_with_a_bad_password_and_cannot_access_the_data 0.01
No log output captured.
Passed suites/acl/keywords_test.py::test_anonymous_user_cannot_access_the_data 0.04
No log output captured.
Passed suites/acl/keywords_test.py::test_authenticated_but_has_no_rigth_on_the_data 0.05
No log output captured.
Passed suites/acl/keywords_test.py::test_the_bind_client_is_accessing_the_directory 0.02
No log output captured.
Passed suites/acl/keywords_test.py::test_users_binds_with_a_password_and_can_access_the_data 0.02
No log output captured.
Passed suites/acl/keywords_test.py::test_user_binds_without_any_password_and_cannot_access_the_data 0.03
No log output captured.
Passed suites/acl/keywords_test.py::test_user_can_access_the_data_when_connecting_from_any_machine 0.04
No log output captured.
Passed suites/acl/keywords_test.py::test_user_can_access_the_data_when_connecting_from_internal_ds_network_only 0.04
No log output captured.
Passed suites/acl/keywords_test.py::test_user_can_access_the_data_when_connecting_from_some_network_only 0.05
No log output captured.
Passed suites/acl/keywords_test.py::test_from_an_unauthorized_network 0.04
No log output captured.
Passed suites/acl/keywords_test.py::test_user_cannot_access_the_data_when_connecting_from_an_unauthorized_network_2 0.04
No log output captured.
Passed suites/acl/keywords_test.py::test_user_cannot_access_the_data_if_not_from_a_certain_domain 0.06
No log output captured.
Passed suites/acl/keywords_test.py::test_dnsalias_keyword_test_nodns_cannot 0.24
No log output captured.
Passed suites/acl/keywords_test.py::test_user_can_access_from_ipv4_or_ipv6_address[127.0.0.1] 0.04
No log output captured.
Passed suites/acl/keywords_test.py::test_user_can_access_from_ipv4_or_ipv6_address[[::1]] 0.02
No log output captured.
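The keywords suites above exercise ACI bind rules such as ip, timeofday, and dayofweek. As hedged, illustrative examples of what such rules look like (the addresses, times, and days below are assumptions, not the suites' actual values):

    # Illustrative ACI strings only; values are assumed for the sketch.
    ACI_IP = ('(targetattr="*")(version 3.0; acl "ip rule"; '
              'allow (all) (userdn="ldap:///anyone") and (ip="127.0.0.1");)')

    ACI_TIMEOFDAY = ('(targetattr="*")(version 3.0; acl "morning only"; '
                     'allow (all) (userdn="ldap:///anyone") '
                     'and (timeofday >= "0800") and (timeofday <= "1200");)')

    ACI_DAYOFWEEK = ('(targetattr="*")(version 3.0; acl "weekdays"; '
                     'allow (all) (userdn="ldap:///anyone") '
                     'and (dayofweek = "Mon, Tue, Wed, Thu, Fri");)')

Tests like test_user_can_access_the_data_only_in_the_morning pass or fail depending on whether the bind time and source address satisfy these rules.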
Passed suites/acl/misc_test.py::test_accept_aci_in_addition_to_acl 0.60
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/misc_test.py::test_more_then_40_acl_will_crash_slapd 0.53
-------------------------------Captured log setup-------------------------------
INFO  lib389:misc_test.py:65 Exception (expected): ALREADY_EXISTS
Passed suites/acl/misc_test.py::test_search_access_should_not_include_read_access 0.01
-------------------------------Captured log setup-------------------------------
INFO  lib389:misc_test.py:65 Exception (expected): ALREADY_EXISTS
Passed suites/acl/misc_test.py::test_only_allow_some_targetattr 0.31
-------------------------------Captured log setup-------------------------------
INFO  lib389:misc_test.py:65 Exception (expected): ALREADY_EXISTS
Passed suites/acl/misc_test.py::test_only_allow_some_targetattr_two 0.34
-------------------------------Captured log setup-------------------------------
INFO  lib389:misc_test.py:65 Exception (expected): ALREADY_EXISTS
Passed suites/acl/misc_test.py::test_memberurl_needs_to_be_normalized 0.13
-------------------------------Captured log setup-------------------------------
INFO  lib389:misc_test.py:65 Exception (expected): ALREADY_EXISTS
Passed suites/acl/misc_test.py::test_greater_than_200_acls_can_be_created 3.80
-------------------------------Captured log setup-------------------------------
INFO  lib389:misc_test.py:65 Exception (expected): ALREADY_EXISTS
Passed suites/acl/misc_test.py::test_server_bahaves_properly_with_very_long_attribute_names 0.06
-------------------------------Captured log setup-------------------------------
INFO  lib389:misc_test.py:65 Exception (expected): ALREADY_EXISTS
Passed suites/acl/misc_test.py::test_do_bind_as_201_distinct_users 143.83
-------------------------------Captured log setup-------------------------------
INFO  lib389:misc_test.py:65 Exception (expected): ALREADY_EXISTS
Passed suites/acl/modify_test.py::test_allow_write_access_to_targetattr_with_a_single_attribute 0.32
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/modify_test.py::test_allow_write_access_to_targetattr_with_multiple_attibutes 0.06
No log output captured.
Passed suites/acl/modify_test.py::test_allow_write_access_to_userdn_all 0.13
No log output captured.
Passed suites/acl/modify_test.py::test_allow_write_access_to_userdn_with_wildcards_in_dn 0.05
No log output captured.
Passed suites/acl/modify_test.py::test_allow_write_access_to_userdn_with_multiple_dns 0.44
No log output captured.
Passed suites/acl/modify_test.py::test_allow_write_access_to_target_with_wildcards 0.19
No log output captured.
Passed suites/acl/modify_test.py::test_allow_write_access_to_userdnattr 0.09
No log output captured.
Passed suites/acl/modify_test.py::test_allow_selfwrite_access_to_anyone 0.08
No log output captured.
Passed suites/acl/modify_test.py::test_uniquemember_should_also_be_the_owner 0.22
No log output captured.
Passed suites/acl/modify_test.py::test_aci_with_both_allow_and_deny 0.17
No log output captured.
Passed suites/acl/modify_test.py::test_allow_owner_to_modify_entry 0.12
No log output captured.
Passed suites/acl/modrdn_test.py::test_allow_write_privilege_to_anyone 0.02
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/modrdn_test.py::test_allow_write_privilege_to_dynamic_group_with_scope_set_to_base_in_ldap_url 0.03
No log output captured.
Passed suites/acl/modrdn_test.py::test_write_access_to_naming_atributes 0.04
No log output captured.
Passed suites/acl/modrdn_test.py::test_write_access_to_naming_atributes_two 0.10
No log output captured.
Passed suites/acl/modrdn_test.py::test_access_aci_list_contains_any_deny_rule 0.13
No log output captured.
Passed suites/acl/modrdn_test.py::test_renaming_target_entry 0.09
No log output captured.
Passed suites/acl/repeated_ldap_add_test.py::test_repeated_ldap_add 33.23
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stdout call------------------------------
uid=buser123,ou=BOU,dc=example,dc=com inactivated.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:184 Testing Bug 1347760 - Information disclosure via repeated use of LDAP ADD operation, etc.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:186 Disabling accesslog logbuffering
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:189 Bind as {cn=Directory Manager,password}
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:192 Adding ou=BOU a bind user belongs to.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:197 Adding a bind user.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:204 Adding a test user.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:211 Deleting aci in dc=example,dc=com.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:214 While binding as DM, acquire an access log path and instance dir
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:220 Bind case 1. the bind user has no rights to read the entry itself, bind should be successful.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:221 Bind as {uid=buser123,ou=BOU,dc=example,dc=com,buser123} who has no access rights.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:229 Access log path: /var/log/dirsrv/slapd-standalone1/access
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:231 Bind case 2-1. the bind user does not exist, bind should fail with error INVALID_CREDENTIALS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:233 Bind as {uid=bogus,dc=example,dc=com,bogus} who does not exist.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:237 Exception (expected): INVALID_CREDENTIALS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:238 Desc Invalid credentials
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:246 Cause found - [20/Oct/2020:19:16:18.000127791 -0400] conn=1 op=11 RESULT err=49 tag=97 nentries=0 wtime=0.000179549 optime=0.007071888 etime=0.007243040 - No such entry
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:249 Bind case 2-2. the bind user's suffix does not exist, bind should fail with error INVALID_CREDENTIALS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:251 Bind as {uid=bogus,ou=people,dc=bogus,bogus} who does not exist.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:260 Cause found - [20/Oct/2020:19:16:19.008815983 -0400] conn=1 op=12 RESULT err=49 tag=97 nentries=0 wtime=0.000214398 optime=0.004316016 etime=0.004524042 - No suffix for bind dn found
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:263 Bind case 2-3. the bind user's password is wrong, bind should fail with error INVALID_CREDENTIALS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:265 Bind as {uid=buser123,ou=BOU,dc=example,dc=com,bogus} whose password is wrong.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:269 Exception (expected): INVALID_CREDENTIALS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:270 Desc Invalid credentials
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:278 Cause found - [20/Oct/2020:19:16:20.042274449 -0400] conn=1 op=13 RESULT err=49 tag=97 nentries=0 wtime=0.000172257 optime=0.029419181 etime=0.029584525 - Invalid credentials
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:281 Adding aci for uid=buser123,ou=BOU,dc=example,dc=com to ou=BOU,dc=example,dc=com.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:283 aci: (targetattr="*")(version 3.0; acl "buser123"; allow(all) userdn = "ldap:///uid=buser123,ou=BOU,dc=example,dc=com";)
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:284 Bind as {cn=Directory Manager,password}
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:289 Bind case 3. the bind user has the right to read the entry itself, bind should be successful.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:290 Bind as {uid=buser123,ou=BOU,dc=example,dc=com,buser123} which should be ok.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:293 The following operations are against the subtree on which the bind user uid=buser123,ou=BOU,dc=example,dc=com has no rights.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:297 Search case 1. the bind user has no rights to read the search entry, it should return no search results with <class 'ldap.SUCCESS'>
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Searching existing entry uid=tuser0,ou=people,dc=example,dc=com, which should be ok.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:141 Search should return none
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:303 Search case 2-1. the search entry does not exist, the search should return no search results with SUCCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Searching non-existing entry uid=bogus,dc=example,dc=com, which should be ok.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:141 Search should return none
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:309 Search case 2-2. the search entry does not exist, the search should return no search results with SUCCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Searching non-existing entry uid=bogus,ou=people,dc=example,dc=com, which should be ok.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:141 Search should return none
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:316 Add case 1. the bind user has no rights AND the adding entry exists, it should fail with INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Adding existing entry uid=tuser0,ou=people,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:322 Add case 2-1. the bind user has no rights AND the adding entry does not exist, it should fail with INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Adding non-existing entry uid=bogus,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:328 Add case 2-2. the bind user has no rights AND the adding entry does not exist, it should fail with INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Adding non-existing entry uid=bogus,ou=people,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:335 Modify case 1. the bind user has no rights AND the modifying entry exists, it should fail with INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Modifying existing entry uid=tuser0,ou=people,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:341 Modify case 2-1. the bind user has no rights AND the modifying entry does not exist, it should fail with INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Modifying non-existing entry uid=bogus,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:347 Modify case 2-2. the bind user has no rights AND the modifying entry does not exist, it should fail with INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Modifying non-existing entry uid=bogus,ou=people,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:354 Modrdn case 1. the bind user has no rights AND the renaming entry exists, it should fail with INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Renaming existing entry uid=tuser0,ou=people,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:360 Modrdn case 2-1. the bind user has no rights AND the renaming entry does not exist, it should fail with INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Renaming non-existing entry uid=bogus,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:366 Modrdn case 2-2. the bind user has no rights AND the renaming entry does not exist, it should fail with INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Renaming non-existing entry uid=bogus,ou=people,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:372 Modrdn case 3. the bind user has no rights AND the node moving an entry to exists, it should fail with INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Moving to existing superior ou=groups,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:378 Modrdn case 4-1. the bind user has no rights AND the node moving an entry to does not exist, it should fail with INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Moving to non-existing superior ou=OU,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:384 Modrdn case 4-2. the bind user has no rights AND the node moving an entry to does not exist, it should fail with INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Moving to non-existing superior ou=OU,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:391 Delete case 1. the bind user has no rights AND the deleting entry exists, it should fail with INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Deleting existing entry uid=tuser0,ou=people,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:397 Delete case 2-1. the bind user has no rights AND the deleting entry does not exist, it should fail with INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Deleting non-existing entry uid=bogus,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:403 Delete case 2-2. the bind user has no rights AND the deleting entry does not exist, it should fail with INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Deleting non-existing entry uid=bogus,ou=people,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:407 EXTRA: Check no regressions
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:408 Adding aci for uid=buser123,ou=BOU,dc=example,dc=com to dc=example,dc=com.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:410 Bind as {cn=Directory Manager,password}
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:415 Bind as {uid=buser123,ou=BOU,dc=example,dc=com,buser123}.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:425 Search case. the search entry does not exist, the search should fail with NO_SUCH_OBJECT
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Searching non-existing entry uid=bogus,ou=people,dc=example,dc=com, which should fail with NO_SUCH_OBJECT.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): NO_SUCH_OBJECT
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc No such object
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:431 Add case. the adding entry already exists, it should fail with ALREADY_EXISTS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Adding existing entry uid=tuser0,ou=people,dc=example,dc=com, which should fail with ALREADY_EXISTS.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): ALREADY_EXISTS
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Already exists
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:436 Modify case. the modifying entry does not exist, it should fail with NO_SUCH_OBJECT
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Modifying non-existing entry uid=bogus,dc=example,dc=com, which should fail with NO_SUCH_OBJECT.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): NO_SUCH_OBJECT
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc No such object
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:441 Modrdn case 1. the renaming entry does not exist, it should fail with NO_SUCH_OBJECT
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Renaming non-existing entry uid=bogus,dc=example,dc=com, which should fail with NO_SUCH_OBJECT.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): NO_SUCH_OBJECT
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc No such object
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:446 Modrdn case 2. the node moving an entry to does not exist, it should fail with NO_SUCH_OBJECT
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Moving to non-existing superior ou=OU,dc=example,dc=com, which should fail with NO_SUCH_OBJECT.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): NO_SUCH_OBJECT
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc No such object
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:451 Delete case. the deleting entry does not exist, it should fail with NO_SUCH_OBJECT
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Deleting non-existing entry uid=bogus,dc=example,dc=com, which should fail with NO_SUCH_OBJECT.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): NO_SUCH_OBJECT
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc No such object
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:454 Inactivate uid=buser123,ou=BOU,dc=example,dc=com
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:461 ['/usr/sbin/ns-inactivate.pl', '-Z', 'standalone1', '-D', 'cn=Directory Manager', '-w', 'password', '-I', 'uid=buser123,ou=BOU,dc=example,dc=com']
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:465 Bind as {uid=buser123,ou=BOU,dc=example,dc=com,buser123} which should fail with UNWILLING_TO_PERFORM.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:469 Exception (expected): UNWILLING_TO_PERFORM
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:470 Desc Server is unwilling to perform
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:473 Bind as {uid=buser123,ou=BOU,dc=example,dc=com,bogus} which should fail with UNWILLING_TO_PERFORM.
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:477 Exception (expected): UNWILLING_TO_PERFORM
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:478 Desc Server is unwilling to perform
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:481 SUCCESS
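Every bind case above follows the same pattern: attempt the operation as a bogus or unprivileged user and require one specific LDAP error, so the server discloses nothing to the client beyond err=49, while the true cause ("No such entry", "No suffix for bind dn found", "Invalid credentials") appears only in the server-side access log. A minimal python-ldap sketch of that pattern (URI and credentials are assumptions):

    import ldap

    conn = ldap.initialize("ldap://localhost:38901")
    try:
        # A nonexistent bind DN and a wrong password must look identical
        # to the client: both raise INVALID_CREDENTIALS (err=49).
        conn.simple_bind_s("uid=bogus,dc=example,dc=com", "bogus")
    except ldap.INVALID_CREDENTIALS:
        pass  # expected; the real cause is visible only in the access log
    else:
        raise AssertionError("bind as a bogus user unexpectedly succeeded")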
Passed suites/acl/roledn_test.py::test_mod_seealso_positive[(STEVE_ROLE, NESTED_ROLE_TESTER)] 0.05
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/roledn_test.py::test_mod_seealso_positive[(HARRY_ROLE, NESTED_ROLE_TESTER)] 0.04
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_positive[(MARY_ROLE, NOT_RULE_ACCESS)] 0.04
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_positive[(STEVE_ROLE, OR_RULE_ACCESS)] 0.04
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_positive[(HARRY_ROLE, OR_RULE_ACCESS)] 0.04
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_positive[(STEVE_ROLE, ALL_ACCESS)] 0.04
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_positive[(HARRY_ROLE, ALL_ACCESS)] 0.04
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_positive[(MARY_ROLE, ALL_ACCESS)] 0.04
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_negative[(MARY_ROLE, NESTED_ROLE_TESTER)] 0.04
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_negative[(STEVE_ROLE, NOT_RULE_ACCESS)] 0.06
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_negative[(HARRY_ROLE, NOT_RULE_ACCESS)] 0.06
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_negative[(MARY_ROLE , OR_RULE_ACCESS)] 0.05
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_anonseealso_positive[NOT_RULE_ACCESS] 0.01
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_anonseealso_positive[ALL_ACCESS] 0.01
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_anonseealso_negaive[NESTED_ROLE_TESTER] 0.02
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_anonseealso_negaive[OR_RULE_ACCESS] 0.02
No log output captured.
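The roledn cases above gate seeAlso modifications on the role held by the bound user. A hedged, illustrative sketch of a roledn-based ACI (the role DN and attribute choice are assumptions taken from the parameter names, not the suite's actual ACI text):

    # Illustrative only: members of the nested role may write seeAlso.
    ACI_ROLEDN = ('(targetattr="seeAlso")(version 3.0; acl "role access"; '
                  'allow (write) '
                  'roledn="ldap:///cn=NESTED_ROLE_TESTER,dc=example,dc=com";)')

The positive parameters pair a role with a rule it satisfies; the negative ones pair a role with a rule it does not, so the same modify attempt must fail.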
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with__target_set_on_non_leaf 0.97
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with__target_set_on_wildcard_non_leaf 0.97
No log output captured.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with__target_set_on_wildcard_leaf 1.01
No log output captured.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with_targetfilter_using_equality_search 0.75
No log output captured.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with_targetfilter_using_equality_search_two 0.92
No log output captured.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with_targetfilter_using_substring_search 0.78
No log output captured.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with_targetfilter_using_substring_search_two 0.93
No log output captured.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with_targetfilter_using_boolean_or_of_two_equality_search 0.16
No log output captured.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_to__userdn_two 0.82
No log output captured.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with_userdn 0.76
No log output captured.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with_targetfilter_using_presence_search 0.14
No log output captured.
Passed suites/acl/search_real_part3_test.py::test_deny_search_access_to_userdn_with_ldap_url 1.03
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/search_real_part3_test.py::test_deny_search_access_to_userdn_with_ldap_url_two 0.85
No log output captured.
Passed suites/acl/search_real_part3_test.py::test_deny_search_access_to_userdn_with_ldap_url_matching_all_users 0.97
No log output captured.
Passed suites/acl/search_real_part3_test.py::test_deny_read_access_to_a_dynamic_group 0.63
No log output captured.
Passed suites/acl/search_real_part3_test.py::test_deny_read_access_to_dynamic_group_with_host_port_set_on_ldap_url 0.54
No log output captured.
Passed suites/acl/search_real_part3_test.py::test_deny_read_access_to_dynamic_group_with_scope_set_to_one_in_ldap_url 0.54
No log output captured.
Passed suites/acl/search_real_part3_test.py::test_deny_read_access_to_dynamic_group_two 0.76
No log output captured.
Passed suites/acl/search_real_part3_test.py::test_deny_access_to_group_should_deny_access_to_all_uniquemember 0.85
No log output captured.
Passed suites/acl/search_real_part3_test.py::test_entry_with_lots_100_attributes 8.21
No log output captured.
Passed suites/acl/search_real_part3_test.py::test_groupdnattr_value_is_another_group 0.13
No log output captured.
Passed suites/acl/search_real_test.py::test_deny_all_access_with_target_set 1.20
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/search_real_test.py::test_deny_all_access_to_a_target_with_wild_card 0.74
No log output captured.
Passed suites/acl/search_real_test.py::test_deny_all_access_without_a_target_set 0.92
No log output captured.
Passed suites/acl/search_real_test.py::test_deny_read_search_and_compare_access_with_target_and_targetattr_set 0.72
No log output captured.
Passed suites/acl/search_real_test.py::test_deny_read_access_to_multiple_groupdns 0.88
No log output captured.
Passed suites/acl/search_real_test.py::test_deny_all_access_to_userdnattr 0.68
No log output captured.
Passed suites/acl/search_real_test.py::test_deny_all_access_with__target_set 0.79
No log output captured.
Passed suites/acl/search_real_test.py::test_deny_all_access_with__targetattr_set 1.36
No log output captured.
Passed suites/acl/search_real_test.py::test_deny_all_access_with_targetattr_set 0.64
No log output captured.
Passed suites/acl/selfdn_permissions_test.py::test_selfdn_permission_add 0.71
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389:selfdn_permissions_test.py:58 Add OCticket47653 that allows 'member' attribute
INFO  lib389:selfdn_permissions_test.py:63 Add cn=bind_entry, dc=example,dc=com
-------------------------------Captured log call--------------------------------
INFO  lib389:selfdn_permissions_test.py:106 ######################### ADD ######################
INFO  lib389:selfdn_permissions_test.py:109 Bind as cn=bind_entry, dc=example,dc=com
INFO  lib389:selfdn_permissions_test.py:139 Try to add cn=test_entry, dc=example,dc=com (aci is missing):
    dn: cn=test_entry, dc=example,dc=com
    cn: test_entry
    member: cn=bind_entry, dc=example,dc=com
    objectclass: top
    objectclass: person
    objectclass: OCticket47653
    postalAddress: here
    postalCode: 1234
    sn: test_entry
INFO  lib389:selfdn_permissions_test.py:143 Exception (expected): INSUFFICIENT_ACCESS
INFO  lib389:selfdn_permissions_test.py:147 Bind as cn=Directory Manager and add the ADD SELFDN aci
INFO  lib389:selfdn_permissions_test.py:159 Bind as cn=bind_entry, dc=example,dc=com
INFO  lib389:selfdn_permissions_test.py:164 Try to add cn=test_entry, dc=example,dc=com (member is missing)
INFO  lib389:selfdn_permissions_test.py:172 Exception (expected): INSUFFICIENT_ACCESS
INFO  lib389:selfdn_permissions_test.py:178 Try to add cn=test_entry, dc=example,dc=com (with several member values)
INFO  lib389:selfdn_permissions_test.py:181 Exception (expected): INSUFFICIENT_ACCESS
INFO  lib389:selfdn_permissions_test.py:184 Try to add cn=test_entry, dc=example,dc=com, which should be successful
Passed suites/acl/selfdn_permissions_test.py::test_selfdn_permission_search 0.63
-------------------------------Captured log call--------------------------------
INFO  lib389:selfdn_permissions_test.py:205 ######################### SEARCH ######################
INFO  lib389:selfdn_permissions_test.py:207 Bind as cn=bind_entry, dc=example,dc=com
INFO  lib389:selfdn_permissions_test.py:211 Try to search cn=test_entry, dc=example,dc=com (aci is missing)
INFO  lib389:selfdn_permissions_test.py:216 Bind as cn=Directory Manager and add the READ/SEARCH SELFDN aci
INFO  lib389:selfdn_permissions_test.py:229 Bind as cn=bind_entry, dc=example,dc=com
INFO  lib389:selfdn_permissions_test.py:233 Try to search cn=test_entry, dc=example,dc=com, which should be successful
Passed suites/acl/selfdn_permissions_test.py::test_selfdn_permission_modify 0.63
-------------------------------Captured log call--------------------------------
INFO  lib389:selfdn_permissions_test.py:256 Bind as cn=bind_entry, dc=example,dc=com
INFO  lib389:selfdn_permissions_test.py:259 ######################### MODIFY ######################
INFO  lib389:selfdn_permissions_test.py:263 Try to modify cn=test_entry, dc=example,dc=com (aci is missing)
INFO  lib389:selfdn_permissions_test.py:267 Exception (expected): INSUFFICIENT_ACCESS
INFO  lib389:selfdn_permissions_test.py:271 Bind as cn=Directory Manager and add the WRITE SELFDN aci
INFO  lib389:selfdn_permissions_test.py:284 Bind as cn=bind_entry, dc=example,dc=com
INFO  lib389:selfdn_permissions_test.py:288 Try to modify cn=test_entry, dc=example,dc=com. It should succeed
Passed suites/acl/selfdn_permissions_test.py::test_selfdn_permission_delete 0.27
-------------------------------Captured log call--------------------------------
INFO  lib389:selfdn_permissions_test.py:314 ######################### DELETE ######################
INFO  lib389:selfdn_permissions_test.py:317 Bind as cn=bind_entry, dc=example,dc=com
INFO  lib389:selfdn_permissions_test.py:322 Try to delete cn=test_entry, dc=example,dc=com (aci is missing)
INFO  lib389:selfdn_permissions_test.py:325 Exception (expected): INSUFFICIENT_ACCESS
INFO  lib389:selfdn_permissions_test.py:329 Bind as cn=Directory Manager and add the READ/SEARCH SELFDN aci
INFO  lib389:selfdn_permissions_test.py:341 Bind as cn=bind_entry, dc=example,dc=com
INFO  lib389:selfdn_permissions_test.py:345 Try to delete cn=test_entry, dc=example,dc=com, which should be successful
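The SELFDN cases above grant a right only when the bind DN appears as a value of an attribute in the target entry, here member, which is why the add attempts fail when member is missing or carries several values. As a hedged sketch (the exact ACI text used by the suite is not shown in this report; the #selfdn bind type and the target below are assumptions):

    # Illustrative only: allow add when the bind DN is the single value of
    # the new entry's `member` attribute.
    ACI_SELFDN_ADD = ('(target = "ldap:///cn=*,dc=example,dc=com")'
                      '(version 3.0; acl "SelfDN add"; '
                      'allow (add) userattr = "member#selfdn";)')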
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_1] 0.16
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_2] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_3] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_4] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_5] 0.16
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_6] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_7] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_8] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_9] 0.03
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_10] 0.17
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_11] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_12] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_13] 0.03
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_14] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_15] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_16] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_17] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_19] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_21] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_22] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_23] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Missing_acl_mispel] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Missing_acl_string] 0.03
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Wrong_version_string] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Missing_version_string] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Authenticate_statement] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Multiple_targets] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Target_set_to_self] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_target_set_with_ldap_instead_of_ldap] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_target_set_with_more_than_three] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_target_set_with_less_than_three] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_bind_rule_set_with_less_than_three] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Use_semicolon_instead_of_comma_in_permission] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Use_double_equal_instead_of_equal_in_the_target] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_use_double_equal_instead_of_equal_in_user_and_group_access] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_donot_cote_the_name_of_the_aci] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_extra_parentheses_case_1] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_extra_parentheses_case_2] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_extra_parentheses_case_3] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_no_semicolon_at_the_end_of_the_aci] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_a_character_different_of_a_semicolon_at_the_end_of_the_aci] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_bad_filter] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Use_double_equal_instead_of_equal_in_the_targattrfilters] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Use_double_equal_instead_of_equal_inside_the_targattrfilters] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_target_set_above_the_entry_test 0.02
No log output captured.
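Note: the invalid-syntax cases above each break one element of the ACI grammar (quoting, the version string, permission separators, parenthesis balance, the trailing semicolon). For contrast, a minimal well-formed ACI has the shape target + version/name + permissions + bind rule. The sketch below is illustrative only; the acl name, targetattr and bind-rule DN are hypothetical, not taken from the suite's fixtures.

    # A well-formed ACI, for contrast with the invalid variants exercised above.
    # The acl name, attribute and DN are hypothetical examples.
    ACI_BODY = (
        '(targetattr="title")'                       # target: which attributes
        '(version 3.0; acl "example-allow-title"; '  # version string and acl name
        'allow (write) '                             # permission list
        'userdn="ldap:///uid=someuser,ou=People,dc=example,dc=com";)'  # bind rule
    )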
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(CAN,ROLEDNACCESS)] 0.04
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(CAN,USERDNACCESS)] 0.02
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(CAN,GROUPDNACCESS)] 0.02
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(CAN,LDAPURLACCESS)] 0.01
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(CAN,ATTRNAMEACCESS)] 0.02
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(LEVEL_0, OU_2)] 0.05
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(LEVEL_1,ANCESTORS)] 0.04
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(LEVEL_2,GRANDPARENTS)] 0.04
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(LEVEL_4,OU_2)] 0.04
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(LEVEL_4, ANCESTORS)] 0.02
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(LEVEL_4,GRANDPARENTS)] 0.01
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(LEVEL_4,PARENTS)] 0.02
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(LEVEL_4,CHILDREN)] 0.02
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(CANNOT,ROLEDNACCESS)] 0.31
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(CANNOT,USERDNACCESS)] 0.03
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(CANNOT,GROUPDNACCESS)] 0.03
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(CANNOT,LDAPURLACCESS)] 0.03
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(CANNOT,ATTRNAMEACCESS)] 0.03
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(LEVEL_0, ANCESTORS)] 0.05
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(LEVEL_0,GRANDPARENTS)] 0.03
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(LEVEL_0,PARENTS)] 0.03
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(LEVEL_0,CHILDREN)] 0.03
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(LEVEL_2,PARENTS)] 0.30
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(LEVEL_4,GRANDSONS)] 0.05
No log output captured.
Passed suites/acl/userattr_test.py::test_last_three[uid=Ananda Borah,ou=Accounting,dc=example,dc=com-uid=USERDNACCESS,ou=Accounting,dc=example,dc=com] 0.05
No log output captured.
Passed suites/acl/userattr_test.py::test_last_three[uid=Ananda Borah,ou=Accounting,dc=example,dc=com-uid=ROLEDNACCESS,ou=Accounting,dc=example,dc=com] 0.03
No log output captured.
Passed suites/acl/userattr_test.py::test_last_three[uid=Ananda Borah,ou=Accounting,dc=example,dc=com-uid=GROUPDNACCESS,ou=Accounting,dc=example,dc=com] 0.03
No log output captured.
Passed suites/acl/valueacl_part2_test.py::test_we_can_search_as_expected 0.01
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/valueacl_part2_test.py::test_we_can_mod_title_as_expected 0.05
No log output captured.
Passed suites/acl/valueacl_part2_test.py::test_modify_with_multiple_filters 0.05
No log output captured.
Passed suites/acl/valueacl_part2_test.py::test_denied_by_multiple_filters 0.06
No log output captured.
Passed suites/acl/valueacl_part2_test.py::test_allowed_add_one_attribute 0.05
No log output captured.
Passed suites/acl/valueacl_part2_test.py::test_cannot_add_an_entry_with_attribute_values_we_are_not_allowed_add 0.07
No log output captured.
Passed suites/acl/valueacl_part2_test.py::test_on_modrdn 0.04
No log output captured.
Passed suites/acl/valueacl_part2_test.py::test_on_modrdn_allow 0.32
No log output captured.
Passed suites/acl/valueacl_part2_test.py::test_targattrfilters_keyword 0.10
No log output captured.
Passed suites/acl/valueacl_test.py::test_delete_an_attribute_value_we_are_not_allowed_to_delete 0.08
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/valueacl_test.py::test_donot_allow_write_access_to_title_if_value_is_not_architect 0.07
No log output captured.
Passed suites/acl/valueacl_test.py::test_delete_an_attribute_value_we_are_allowed_to_delete 0.05
No log output captured.
Passed suites/acl/valueacl_test.py::test_delete_an_attribute_value_we_are_not_allowed_to_deleted 0.06
No log output captured.
Passed suites/acl/valueacl_test.py::test_allow_modify_replace 0.07
No log output captured.
Passed suites/acl/valueacl_test.py::test_allow_modify_delete 0.08
No log output captured.
Passed suites/acl/valueacl_test.py::test_replace_an_attribute_if_we_lack 0.07
No log output captured.
Passed suites/acl/valueacl_test.py::test_remove_an_attribute_if_we_have_del_rights_to_all_attr_value 0.05
No log output captured.
Passed suites/acl/valueacl_test.py::test_remove_an_attribute_if_we_donot_have_del_rights_to_all_attr_value 0.06
No log output captured.
Passed suites/acl/valueacl_test.py::test_remove_an_attribute_if_we_have_del_rights_to_all_attr_values 0.05
No log output captured.
Passed suites/acl/valueacl_test.py::test_cantnot_delete_an_entry_with_attribute_values_we_are_not_allowed_delete 0.14
No log output captured.
Passed suites/acl/valueacl_test.py::test_we_can_add_and_delete_an_entry_with_attribute_values_we_are_allowed_add_and_delete 0.06
No log output captured.
Passed suites/acl/valueacl_test.py::test_allow_title 0.07
No log output captured.
Passed suites/acl/valueacl_test.py::test_allow_to_modify 0.06
No log output captured.
Passed suites/acl/valueacl_test.py::test_selfwrite_does_not_confer_write_on_a_targattrfilters_atribute 0.05
No log output captured.
Passed suites/acl/valueacl_test.py::test_selfwrite_continues_to_give_rights_to_attr_in_targetattr_list 0.06
No log output captured.
Passed suites/acl/valueacl_test.py::test_add_an_attribute_value_we_are_allowed_to_add_with_ldapanyone 0.05
No log output captured.
Passed suites/acl/valueacl_test.py::test_hierarchy 0.07
No log output captured.
Passed suites/acl/valueacl_test.py::test_targattrfilters_and_search_permissions_and_that_ldapmodify_works_as_expected 0.27
No log output captured.
Passed suites/acl/valueacl_test.py::test_targattrfilters_and_search_permissions_and_that_ldapmodify_works_as_expected_two 0.01
No log output captured.
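The valueacl tests above all revolve around the targattrfilters keyword, which grants rights per attribute value rather than per attribute. A rough sketch of its shape (the filter and acl name are made-up examples, not the suite's fixtures):

    # targattrfilters grants add/del rights only for values matching a filter.
    # General form: "add=attr1:(F1) && attr2:(F2);del=attr1:(F1)"
    ACI_VALUE_BASED = (
        '(targattrfilters="add=title:(title=architect);del=title:(title=architect)")'
        '(version 3.0; acl "example-value-acl"; '
        'allow (write) userdn="ldap:///self";)'
    )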
Passed suites/attr_encryption/attr_encryption_test.py::test_basic 5.34
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:35 Enable TLS for attribute encryption
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:38 Enables attribute encryption
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:42 Enables attribute encryption for employeeNumber and telephoneNumber
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:46 Add a test user with encrypted attributes
-------------------------------Captured log call--------------------------------
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:81 Restart the server
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:87 Extracting values of cn from the list of objects in encrypt_attrs
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:88 And appending the cn values in a list
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:93 Check employeenumber encryption is enabled
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:96 Check telephoneNumber encryption is enabled
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:99 Check that encrypted attribute is present for user i.e. telephonenumber
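The setup log above enables attribute encryption for employeeNumber and telephoneNumber. In 389-ds this is configured by entries under cn=encrypted attributes in the backend's config; a minimal sketch, assuming lib389's EncryptedAttrs helper and a standalone instance bound to the variable inst:

    from lib389.backend import Backends
    from lib389.encrypted_attributes import EncryptedAttrs

    # Encryption config lives under the backend entry, e.g.
    # cn=encrypted attributes,cn=userRoot,cn=ldbm database,cn=plugins,cn=config
    backend = Backends(inst).get('userRoot')
    encrypt_attrs = EncryptedAttrs(inst, basedn='cn=encrypted attributes,%s' % backend.dn)
    encrypt_attrs.create(properties={'cn': 'employeeNumber',
                                     'nsEncryptionAlgorithm': 'AES'})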
Passed suites/attr_encryption/attr_encryption_test.py::test_export_import_ciphertext 12.85
------------------------------Captured stderr call------------------------------
ldiffile: /var/lib/dirsrv/slapd-standalone1/ldif/export_ciphertext.ldif
-------------------------------Captured log call--------------------------------
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:127 Export data as ciphertext
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:138 Check that the attribute is present in the exported file
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:139 Check that the encrypted value of attribute is not present in the exported file
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:145 Delete the test user entry with encrypted data
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:148 Import data as ciphertext, which was exported previously
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:159 Check that the data with encrypted attribute is imported properly
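This export/import round-trips the data without decrypting it. With lib389's offline db2ldif/ldif2db wrappers the flow is roughly the following; the keyword names are from the DirSrv API as I recall it, so treat them as an assumption and verify against your lib389 version:

    ldif_file = '/var/lib/dirsrv/slapd-standalone1/ldif/export_ciphertext.ldif'

    inst.stop()
    # encrypt=False leaves encrypted attributes as stored ciphertext in the LDIF;
    # encrypt=True would decrypt on export (the plaintext variant tested next).
    inst.db2ldif(bename='userRoot', suffixes=['dc=example,dc=com'],
                 excludeSuffixes=None, encrypt=False, repl_data=False,
                 outputfile=ldif_file)
    inst.ldif2db(bename='userRoot', suffixes=['dc=example,dc=com'],
                 excludeSuffixes=None, encrypt=False, import_file=ldif_file)
    inst.start()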
Passed suites/attr_encryption/attr_encryption_test.py::test_export_import_plaintext 15.36
------------------------------Captured stderr call------------------------------
[20/Oct/2020:19:19:35.776598170 -0400] - WARN - Security Initialization - /tmp is not a private namespace. pem files not exported there
[20/Oct/2020:19:19:35.785180536 -0400] - INFO - slapd_extract_cert - CA CERT NAME: Self-Signed-CA
[20/Oct/2020:19:19:35.788522826 -0400] - ERR - slapd_extract_cert - Unable to open "/tmp/slapd-standalone1/Self-Signed-CA.pem" for writing (-5950, 2).
[20/Oct/2020:19:19:35.792052689 -0400] - WARN - Security Initialization - SSL alert: Sending pin request to SVRCore. You may need to run systemd-tty-ask-password-agent to provide the password.
[20/Oct/2020:19:19:36.131002868 -0400] - INFO - slapd_extract_cert - SERVER CERT NAME: Server-Cert
[20/Oct/2020:19:19:36.134338867 -0400] - WARN - Security Initialization - /tmp is not a private namespace. pem files not exported there
[20/Oct/2020:19:19:36.138076991 -0400] - WARN - Security Initialization - /tmp is not a private namespace. pem files not exported there
ldiffile: /var/lib/dirsrv/slapd-standalone1/ldif/export_plaintext.ldif
[20/Oct/2020:19:19:42.877141355 -0400] - WARN - Security Initialization - /tmp is not a private namespace. pem files not exported there
[20/Oct/2020:19:19:42.882340683 -0400] - INFO - slapd_extract_cert - CA CERT NAME: Self-Signed-CA
[20/Oct/2020:19:19:42.885047820 -0400] - ERR - slapd_extract_cert - Unable to open "/tmp/slapd-standalone1/Self-Signed-CA.pem" for writing (-5950, 2).
[20/Oct/2020:19:19:42.888314340 -0400] - WARN - Security Initialization - SSL alert: Sending pin request to SVRCore. You may need to run systemd-tty-ask-password-agent to provide the password.
[20/Oct/2020:19:19:43.092245398 -0400] - INFO - slapd_extract_cert - SERVER CERT NAME: Server-Cert
[20/Oct/2020:19:19:43.096355148 -0400] - WARN - Security Initialization - /tmp is not a private namespace. pem files not exported there
[20/Oct/2020:19:19:43.100972447 -0400] - WARN - Security Initialization - /tmp is not a private namespace. pem files not exported there
-------------------------------Captured log call--------------------------------
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:189 Export data as plain text
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:200 Check that the attribute is present in the exported file
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:201 Check that the plain text value of the encrypted attribute is present in the exported file
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:205 Delete the test user entry with encrypted data
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:208 Import data as plain text, which was exported previously
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:219 Check that the attribute is imported properly
Passed suites/attr_encryption/attr_encryption_test.py::test_attr_encryption_unindexed 5.68
------------------------------Captured stderr call------------------------------
ldiffile: /var/lib/dirsrv/slapd-standalone1/ldif/emp_num_ciphertext.ldif
-------------------------------Captured log call--------------------------------
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:242 Export data as cipher text
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:253 Check that the attribute is present in the exported file
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:254 Check that the encrypted value of attribute is not present in the exported file
Passed suites/attr_encryption/attr_encryption_test.py::test_attr_encryption_multiple_backends 8.74
------------------------------Captured stderr call------------------------------
ldiffile: /var/lib/dirsrv/slapd-standalone1/ldif/export_db1.ldif
ldiffile: /var/lib/dirsrv/slapd-standalone1/ldif/export_db2.ldif
-------------------------------Captured log call--------------------------------
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:287 Add two test backends
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:307 Enables attribute encryption for telephoneNumber in test_backend1
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:312 Enables attribute encryption for employeeNumber in test_backend2
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:317 Add a test user with encrypted attributes in both backends
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:326 Export data as ciphertext from both backends
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:343 Check that the attribute is present in the exported file in db1
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:344 Check that the encrypted value of attribute is not present in the exported file in db1
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:350 Check that the attribute is present in the exported file in db2
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:351 Check that the encrypted value of attribute is not present in the exported file in db2
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:357 Delete test backends
Passed suites/attr_encryption/attr_encryption_test.py::test_attr_encryption_backends 9.80
------------------------------Captured stderr call------------------------------
ldiffile: /var/lib/dirsrv/slapd-standalone1/ldif/export_db1.ldif
ldiffile: /var/lib/dirsrv/slapd-standalone1/ldif/export_db2.ldif
-------------------------------Captured log call--------------------------------
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:386 Add two test backends
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:405 Enables attribute encryption for telephoneNumber in test_backend1
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:410 Add a test user with telephoneNumber in both backends
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:419 Export data as ciphertext from both backends
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:436 Check that the attribute is present in the exported file in db1
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:437 Check that the encrypted value of attribute is not present in the exported file in db1
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:443 Check that the attribute is present in the exported file in db2
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:444 Check that the value of attribute is also present in the exported file in db2
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:450 Delete test backends
Passed suites/automember_plugin/automember_mod_test.py::test_mods 11.25
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.automember_plugin.automember_mod_test:automember_mod_test.py:135 Test PASSED
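test_mods exercises modifications against an Auto Membership configuration. The plugin is driven by definition entries that say which entries to watch and which group to populate; a minimal sketch of creating one with lib389's plugin wrappers (the scope, filter and group DN here are hypothetical examples):

    from lib389.plugins import AutoMembershipPlugin, AutoMembershipDefinitions

    AutoMembershipPlugin(inst).enable()

    definitions = AutoMembershipDefinitions(inst)
    definitions.create(properties={
        'cn': 'testgroup',
        'autoMemberScope': 'ou=People,dc=example,dc=com',      # where to watch
        'autoMemberFilter': 'objectclass=posixaccount',        # which entries match
        'autoMemberDefaultGroup': 'cn=testgroup,ou=Groups,dc=example,dc=com',
        'autoMemberGroupingAttr': 'member:dn',                 # attr:value-type
    })
    inst.restart()  # plugin enablement takes effect on restart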
Passed suites/automember_plugin/automember_test.py::test_automemberscope 0.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/automember_plugin/automember_test.py::test_automemberfilter 0.26
No log output captured.
Passed suites/automember_plugin/automember_test.py::test_adduser 0.32
No log output captured.
Passed suites/automember_plugin/automember_test.py::test_delete_default_group 4.14
No log output captured.
Passed suites/automember_plugin/automember_test.py::test_no_default_group 4.34
No log output captured.
Passed suites/automember_plugin/automember_test.py::test_delete_target_group 5.15
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_disable_the_plug_in 0.68
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
Passed suites/automember_plugin/basic_test.py::test_custom_config_area 0.27
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_ability_to_control_behavior_of_modifiers_name 8.55
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_posixaccount_objectclass_automemberdefaultgroup 0.07
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_duplicated_member_attributes_added_when_the_entry_is_re_created 0.37
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_multi_valued_automemberdefaultgroup_for_hostgroups 0.06
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_plugin_creates_member_attributes_of_the_automemberdefaultgroup 0.07
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_multi_valued_automemberdefaultgroup_with_uniquemember 8.33
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_invalid_automembergroupingattr_member 0.81
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_valid_and_invalid_automembergroupingattr 0.28
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_add_regular_expressions_for_user_groups_and_check_for_member_attribute_after_adding_users 0.16
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_matching_gid_role_inclusive_regular_expression[autoMembers_22-5288-5289-Contractor-5291-5292-Contractors] 0.06
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_matching_gid_role_inclusive_regular_expression[autoMembers_21-1161-1162-Contractor-1162-1163-Contractors] 0.06
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_matching_gid_role_inclusive_regular_expression[autoMembers_20-1188-1189-CEO-1191-1192-Contractors] 0.08
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_matching_gid_role_inclusive_regular_expression[autoMembers_15-9288-9289-Manager-9291-9292-Managers] 0.06
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_matching_gid_role_inclusive_regular_expression[autoMembers_14-561-562-Manager-562-563-Managers] 0.29
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_matching_gid_role_inclusive_regular_expression[autoMembers_13-9788-9789-VPEngg-9392-9393-Managers] 0.09
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_gid_and_role_inclusive_exclusive_regular_expression[autoMembers_26-5788-5789-Intern-Contractors-SuffDef1-5] 0.09
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_gid_and_role_inclusive_exclusive_regular_expression[autoMembers_25-9788-9789-Employee-Contractors-Managers-1] 0.17
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_gid_and_role_inclusive_exclusive_regular_expression[autoMembers_24-1110-1111-Employee-Contractors-SuffDef1-5] 0.08
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_gid_and_role_inclusive_exclusive_regular_expression[autoMembers_23-2788-2789-Contractor-Contractors-SuffDef1-5] 0.07
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_gid_and_role_inclusive_exclusive_regular_expression[autoMembers_19-5788-5789-HRManager-Managers-SuffDef1-5] 0.09
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_gid_and_role_inclusive_exclusive_regular_expression[autoMembers_18-6788-6789-Junior-Managers-SuffDef1-5] 0.08
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_gid_and_role_inclusive_exclusive_regular_expression[autoMembers_17-562-563-Junior-Managers-SuffDef1-5] 0.07
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_gid_and_role_inclusive_exclusive_regular_expression[autoMembers_16-6788-6789-Manager-Managers-SuffDef1-5] 0.27
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_managers_contractors_exclusive_regex_rules_member_uid[autoMembers_32-555-720-Employee-SubDef1-SubDef3] 0.09
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_managers_contractors_exclusive_regex_rules_member_uid[autoMembers_31-515-200-Junior-SubDef1-SubDef5] 0.07
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_managers_contractors_exclusive_regex_rules_member_uid[autoMembers_30-999-400-Supervisor-SubDef1-SubDef2] 0.09
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_managers_contractors_exclusive_regex_rules_member_uid[autoMembers_28-555-3663-ContractHR-Contractors,cn=subsuffGroups-Managers,cn=subsuffGroups] 0.08
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_managers_inclusive_regex_rule[autoMembers_27-595-690-ContractHR-Managers-Contractors] 0.28
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_managers_inclusive_regex_rule[autoMembers_29-8195-2753-Employee-Contractors-Managers] 0.07
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_managers_inclusive_regex_rule[autoMembers_33-545-3333-Supervisor-Contractors-Managers] 0.06
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_managers_inclusive_regex_rule[autoMembers_34-8195-693-Temporary-Managers-Contractors] 0.07
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_reject_invalid_config_and_we_donot_deadlock_the_server 8.69
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_automemtask_re_build_task 10.60
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_automemtask_export_task 9.06
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:1039 Automember Export Updates task (task-10202020_192216) completed successfully
Passed suites/automember_plugin/basic_test.py::test_automemtask_mapping 2.38
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:1087 Automember Map Updates task (task-10202020_192219) completed successfully
Passed suites/automember_plugin/basic_test.py::test_automemtask_re_build 7.51
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_automemtask_export 12.89
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:1039 Automember Export Updates task (task-10202020_192237) completed successfully
Passed suites/automember_plugin/basic_test.py::test_automemtask_run_re_build 21.55
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_automemtask_run_export 14.50
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:1039 Automember Export Updates task (task-10202020_192318) completed successfully
Passed suites/automember_plugin/configuration_test.py::test_configuration 4.30
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/backups/backup_test.py::test_missing_backend 5.64
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/basic/basic_test.py::test_basic_ops 0.48
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/basic/basic_test.py::test_basic_import_export 49.81
------------------------------Captured stderr call------------------------------
ldiffile: /var/lib/dirsrv/slapd-standalone1/ldif/export.ldif
Passed suites/basic/basic_test.py::test_basic_backup 11.57
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:619 Backup task backup_10202020_192442 completed successfully
INFO  lib389:tasks.py:673 Restore task restore_10202020_192445 completed successfully
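test_basic_backup drives the online backup and restore tasks shown above through lib389's Tasks interface. Roughly (the TASK_WAIT import location and the backup_dir value are my assumptions, not taken from the test):

    from lib389._constants import TASK_WAIT  # assumption: constant lives here

    backup_dir = '/var/lib/dirsrv/slapd-standalone1/bak/backup_test'

    # Online backup: adds a backup task entry under cn=tasks and waits for it.
    inst.tasks.db2bak(backup_dir=backup_dir, args={TASK_WAIT: True})
    # Online restore from the same directory.
    inst.tasks.bak2db(backup_dir=backup_dir, args={TASK_WAIT: True})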
Passed suites/basic/basic_test.py::test_basic_db2index 5.98
------------------------------Captured stderr call------------------------------
[20/Oct/2020:19:24:57.151139950 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000
[20/Oct/2020:19:24:57.165429284 -0400] - INFO - check_and_set_import_cache - pagesize: 4096, available bytes 7571038208, process usage 22749184
[20/Oct/2020:19:24:57.168923082 -0400] - INFO - check_and_set_import_cache - Import allocates 2957436KB import cache.
[20/Oct/2020:19:24:57.173475477 -0400] - INFO - bdb_copy_directory - Backing up file 0 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/givenName.db)
[20/Oct/2020:19:24:57.177304636 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/givenName.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/givenName.db
[20/Oct/2020:19:24:57.184707991 -0400] - INFO - bdb_copy_directory - Backing up file 1 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/aci.db)
[20/Oct/2020:19:24:57.189271959 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/aci.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/aci.db
[20/Oct/2020:19:24:57.192620499 -0400] - INFO - bdb_copy_directory - Backing up file 2 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/sn.db)
[20/Oct/2020:19:24:57.198116524 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/sn.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/sn.db
[20/Oct/2020:19:24:57.201936070 -0400] - INFO - bdb_copy_directory - Backing up file 3 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/numsubordinates.db)
[20/Oct/2020:19:24:57.205329599 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/numsubordinates.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/numsubordinates.db
[20/Oct/2020:19:24:57.207936414 -0400] - INFO - bdb_copy_directory - Backing up file 4 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/entryusn.db)
[20/Oct/2020:19:24:57.210733945 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/entryusn.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/entryusn.db
[20/Oct/2020:19:24:57.213398012 -0400] - INFO - bdb_copy_directory - Backing up file 5 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/nsuniqueid.db)
[20/Oct/2020:19:24:57.215758869 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/nsuniqueid.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/nsuniqueid.db
[20/Oct/2020:19:24:57.219543268 -0400] - INFO - bdb_copy_directory - Backing up file 6 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/ancestorid.db)
[20/Oct/2020:19:24:57.222546133 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/ancestorid.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/ancestorid.db
[20/Oct/2020:19:24:57.225399937 -0400] - INFO - bdb_copy_directory - Backing up file 7 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/parentid.db)
[20/Oct/2020:19:24:57.228251052 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/parentid.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/parentid.db
[20/Oct/2020:19:24:57.231066470 -0400] - INFO - bdb_copy_directory - Backing up file 8 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/mail.db)
[20/Oct/2020:19:24:57.233827798 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/mail.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/mail.db
[20/Oct/2020:19:24:57.236663115 -0400] - INFO - bdb_copy_directory - Backing up file 9 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/telephoneNumber.db)
[20/Oct/2020:19:24:57.239511895 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/telephoneNumber.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/telephoneNumber.db
[20/Oct/2020:19:24:57.242759376 -0400] - INFO - bdb_copy_directory - Backing up file 10 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/cn.db)
[20/Oct/2020:19:24:57.245854638 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/cn.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/cn.db
[20/Oct/2020:19:24:57.248976382 -0400] - INFO - bdb_copy_directory - Backing up file 11 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/DBVERSION)
[20/Oct/2020:19:24:57.251901273 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/DBVERSION to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/DBVERSION
[20/Oct/2020:19:24:57.254671531 -0400] - INFO - bdb_copy_directory - Backing up file 12 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/uid.db)
[20/Oct/2020:19:24:57.257262973 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/uid.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/uid.db
[20/Oct/2020:19:24:57.260133153 -0400] - INFO - bdb_copy_directory - Backing up file 13 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/entryrdn.db)
[20/Oct/2020:19:24:57.262746899 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/entryrdn.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/entryrdn.db
[20/Oct/2020:19:24:57.265862584 -0400] - INFO - bdb_copy_directory - Backing up file 14 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/uniquemember.db)
[20/Oct/2020:19:24:57.268609762 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/uniquemember.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/uniquemember.db
[20/Oct/2020:19:24:57.271339799 -0400] - INFO - bdb_copy_directory - Backing up file 15 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/id2entry.db)
[20/Oct/2020:19:24:57.280868053 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/id2entry.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/id2entry.db
[20/Oct/2020:19:24:57.283968509 -0400] - INFO - bdb_copy_directory - Backing up file 16 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/objectclass.db)
[20/Oct/2020:19:24:57.295528825 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/objectclass.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/userRoot/objectclass.db
[20/Oct/2020:19:24:57.299119202 -0400] - INFO - upgradedb_core - userRoot: Start upgradedb.
[20/Oct/2020:19:24:57.302395956 -0400] - INFO - bdb_instance_start - Import is running with nsslapd-db-private-import-mem on; No other process is allowed to access the database
[20/Oct/2020:19:24:57.305319806 -0400] - INFO - check_and_set_import_cache - pagesize: 4096, available bytes 7570518016, process usage 23670784
[20/Oct/2020:19:24:57.308734336 -0400] - INFO - check_and_set_import_cache - Import allocates 2957233KB import cache.
[20/Oct/2020:19:24:57.532701700 -0400] - INFO - bdb_import_main - reindex userRoot: Index buffering enabled with bucket size 100
[20/Oct/2020:19:24:58.238590404 -0400] - INFO - import_monitor_threads - reindex userRoot: Workers finished; cleaning up...
[20/Oct/2020:19:24:58.443496597 -0400] - INFO - import_monitor_threads - reindex userRoot: Workers cleaned up.
[20/Oct/2020:19:24:58.446197316 -0400] - INFO - bdb_import_main - reindex userRoot: Cleaning up producer thread...
[20/Oct/2020:19:24:58.449468176 -0400] - INFO - bdb_import_main - reindex userRoot: Indexing complete. Post-processing...
[20/Oct/2020:19:24:58.452570834 -0400] - INFO - bdb_import_main - reindex userRoot: Generating numsubordinates (this may take several minutes to complete)...
[20/Oct/2020:19:24:58.455193434 -0400] - INFO - bdb_import_main - reindex userRoot: Generating numSubordinates complete.
[20/Oct/2020:19:24:58.464365456 -0400] - INFO - bdb_get_nonleaf_ids - reindex userRoot: Gathering ancestorid non-leaf IDs...
[20/Oct/2020:19:24:58.467429518 -0400] - INFO - bdb_get_nonleaf_ids - reindex userRoot: Finished gathering ancestorid non-leaf IDs.
[20/Oct/2020:19:24:58.470111809 -0400] - INFO - ldbm_get_nonleaf_ids - reindex userRoot: Starting sort of ancestorid non-leaf IDs...
[20/Oct/2020:19:24:58.472705543 -0400] - INFO - ldbm_get_nonleaf_ids - reindex userRoot: Finished sort of ancestorid non-leaf IDs.
[20/Oct/2020:19:24:58.479163048 -0400] - INFO - bdb_ancestorid_new_idl_create_index - reindex userRoot: Creating ancestorid index (new idl)...
[20/Oct/2020:19:24:58.482506020 -0400] - INFO - bdb_ancestorid_new_idl_create_index - reindex userRoot: Created ancestorid index (new idl).
[20/Oct/2020:19:24:58.485087563 -0400] - INFO - bdb_import_main - reindex userRoot: Flushing caches...
[20/Oct/2020:19:24:58.487895555 -0400] - INFO - bdb_import_main - reindex userRoot: Closing files...
[20/Oct/2020:19:24:58.602919051 -0400] - INFO - bdb_import_main - reindex userRoot: Reindexing complete. Processed 160 entries in 1 seconds. (160.00 entries/sec)
[20/Oct/2020:19:24:58.607882172 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/log.0000000001 to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/log.0000000001
[20/Oct/2020:19:24:58.623213790 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/DBVERSION to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:24:56.998105/DBVERSION
[20/Oct/2020:19:24:58.626566870 -0400] - INFO - bdb_pre_close - All database threads now stopped
[20/Oct/2020:19:24:58.815863895 -0400] - INFO - slapd_exemode_db2index - Backend Instance: userRoot
[20/Oct/2020:19:24:58.827018871 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000
[20/Oct/2020:19:24:58.834391848 -0400] - INFO - bdb_instance_start - Import is running with nsslapd-db-private-import-mem on; No other process is allowed to access the database
[20/Oct/2020:19:24:58.837557818 -0400] - INFO - check_and_set_import_cache - pagesize: 4096, available bytes 7573409792, process usage 22986752
[20/Oct/2020:19:24:58.840135042 -0400] - INFO - check_and_set_import_cache - Import allocates 2958363KB import cache.
[20/Oct/2020:19:24:58.988627379 -0400] - INFO - bdb_db2index - userRoot: Indexing attribute: uid
[20/Oct/2020:19:24:58.991766036 -0400] - ERR - libdb - BDB1566 txn_checkpoint interface requires an environment configured for the transaction subsystem
[20/Oct/2020:19:24:58.994200532 -0400] - ERR - dblayer_force_checkpoint - Checkpoint FAILED, error Invalid argument (22)
[20/Oct/2020:19:24:59.006899433 -0400] - INFO - bdb_db2index - userRoot: Finished indexing.
[20/Oct/2020:19:24:59.031917987 -0400] - INFO - bdb_pre_close - All database threads now stopped
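The stderr above comes from an offline db2index run against the stopped instance (the txn_checkpoint ERR lines are a side effect of running without the transaction subsystem, and the task still finishes). The online equivalent through lib389 is the reindex task; a sketch, with the same caveat that the TASK_WAIT import location is my assumption:

    from lib389._constants import TASK_WAIT  # assumption, as above

    # Online reindex of the uid attribute in the userRoot backend.
    inst.tasks.reindex(suffix='dc=example,dc=com', attrname='uid',
                       args={TASK_WAIT: True})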
Passed suites/basic/basic_test.py::test_basic_acl 0.78
No log output captured.
Passed suites/basic/basic_test.py::test_basic_searches 0.25
No log output captured.
Passed suites/basic/basic_test.py::test_search_req_attrs[attrs0-cn-False] 0.00
No log output captured.
Passed suites/basic/basic_test.py::test_search_req_attrs[attrs1-cn-True] 0.00
No log output captured.
Passed suites/basic/basic_test.py::test_search_req_attrs[attrs2-nsUniqueId-True] 0.00
No log output captured.
Passed suites/basic/basic_test.py::test_search_req_attrs[attrs3-cn-True] 0.00
No log output captured.
Passed suites/basic/basic_test.py::test_search_req_attrs[attrs4-cn-True] 0.16
No log output captured.
Passed suites/basic/basic_test.py::test_basic_referrals 3.49
No log output captured.
Passed suites/basic/basic_test.py::test_basic_systemctl 11.89
No log output captured.
Passed suites/basic/basic_test.py::test_basic_ldapagent 5.02
No log output captured.
Passed suites/basic/basic_test.py::test_basic_dse_survives_kill9 11.40
No log output captured.
Passed suites/basic/basic_test.py::test_def_rootdse_attr[namingContexts] 0.02
No log output captured.
Passed suites/basic/basic_test.py::test_def_rootdse_attr[supportedLDAPVersion] 0.02
No log output captured.
Passed suites/basic/basic_test.py::test_def_rootdse_attr[supportedControl] 0.02
No log output captured.
Passed suites/basic/basic_test.py::test_def_rootdse_attr[supportedExtension] 0.17
No log output captured.
Passed suites/basic/basic_test.py::test_def_rootdse_attr[supportedSASLMechanisms] 0.02
No log output captured.
Passed suites/basic/basic_test.py::test_def_rootdse_attr[vendorName] 0.02
No log output captured.
Passed suites/basic/basic_test.py::test_def_rootdse_attr[vendorVersion] 0.17
No log output captured.
Passed suites/basic/basic_test.py::test_mod_def_rootdse_attr[namingContexts] 0.00
No log output captured.
Passed suites/basic/basic_test.py::test_mod_def_rootdse_attr[supportedLDAPVersion] 0.00
No log output captured.
Passed suites/basic/basic_test.py::test_mod_def_rootdse_attr[supportedControl] 0.00
No log output captured.
Passed suites/basic/basic_test.py::test_mod_def_rootdse_attr[supportedExtension] 0.00
No log output captured.
Passed suites/basic/basic_test.py::test_mod_def_rootdse_attr[supportedSASLMechanisms] 0.00
No log output captured.
Passed suites/basic/basic_test.py::test_mod_def_rootdse_attr[vendorName] 0.01
No log output captured.
Passed suites/basic/basic_test.py::test_mod_def_rootdse_attr[vendorVersion] 0.00
No log output captured.
Passed suites/basic/basic_test.py::test_basic_anonymous_search 0.03
No log output captured.
Passed suites/basic/basic_test.py::test_search_original_type 0.02
No log output captured.
Passed suites/basic/basic_test.py::test_search_ou 0.01
No log output captured.
Passed suites/basic/basic_test.py::test_connection_buffer_size 0.02
No log output captured.
Passed suites/basic/basic_test.py::test_critical_msg_on_empty_range_idl 7.42
No log output captured.
Passed suites/basic/basic_test.py::test_ldbm_modification_audit_log 12.35
No log output captured.
Passed suites/basic/basic_test.py::test_dscreate_with_different_rdn[wrong=some_value] 0.42
-----------------------------Captured stdout setup------------------------------
No such instance 'test_different_rdn'
Unable to access instance information. Are you running as the correct user? (usually dirsrv or root)
-------------------------------Captured log setup-------------------------------
CRITICAL tests.suites.basic.basic_test:basic_test.py:1441 Failed to remove test instance Error (1) None
------------------------------Captured stdout call------------------------------
Starting installation...
Error: port 38999 is already in use, or missing NET_BIND_SERVICE
-------------------------------Captured log call--------------------------------
CRITICAL tests.suites.basic.basic_test:basic_test.py:1474 dscreate failed! Error (1) None
Passed suites/betxns/betxn_test.py::test_betxt_7bit 4.73
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.betxns.betxn_test:betxn_test.py:52 Running test_betxt_7bit...
INFO  tests.suites.betxns.betxn_test:betxn_test.py:78 test_betxt_7bit: PASSED
Passed suites/betxns/betxn_test.py::test_betxn_attr_uniqueness 4.26
-------------------------------Captured log call--------------------------------
INFO  tests.suites.betxns.betxn_test:betxn_test.py:133 test_betxn_attr_uniqueness: PASSED
Passed suites/betxns/betxn_test.py::test_betxn_memberof 5.73
-------------------------------Captured log call--------------------------------
INFO  tests.suites.betxns.betxn_test:betxn_test.py:179 test_betxn_memberof: PASSED
Passed suites/betxns/betxn_test.py::test_betxn_modrdn_memberof_cache_corruption 3.84
-------------------------------Captured log call--------------------------------
INFO  tests.suites.betxns.betxn_test:betxn_test.py:233 test_betxn_modrdn_memberof: PASSED
Passed suites/betxns/betxn_test.py::test_ri_and_mep_cache_corruption 0.69
-------------------------------Captured log call--------------------------------
INFO  tests.suites.betxns.betxn_test:betxn_test.py:357 Test PASSED
Passed suites/clu/clu_test.py::test_clu_pwdhash 0.04
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.clu_test:clu_test.py:40 Running test_clu_pwdhash...
INFO  tests.suites.clu.clu_test:clu_test.py:54 pwdhash generated: {SSHA}MYXr5JxD03DoE0LioY5cMKrHtYNWsoxEssn2IA==
INFO  tests.suites.clu.clu_test:clu_test.py:55 test_clu_pwdhash: PASSED
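test_clu_pwdhash shells out to the pwdhash CLI, which hashes a clear-text password with a chosen storage scheme. A minimal reproduction of that call (scheme and password are arbitrary examples):

    import subprocess

    # -s selects the password storage scheme; stdout is e.g. "{SSHA}...".
    result = subprocess.run(['pwdhash', '-s', 'SSHA', 'password'],
                            capture_output=True, text=True, check=True)
    print(result.stdout.strip())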
Passed suites/clu/clu_test.py::test_clu_pwdhash_mod 0.30
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.clu_test:clu_test.py:78 Running test_clu_pwdhash_mod...
INFO  tests.suites.clu.clu_test:clu_test.py:87 pwdhash generated: {SSHA256}LUL5zCIBSCjtscD6likXaIigEMiXHa/GVlWfO6MiRT+8pE8l7DiDHw==
INFO  tests.suites.clu.clu_test:clu_test.py:88 returned the hashed string using the algorithm set in nsslapd-rootpwstoragescheme
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_users 6.70
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:119 Run ldifgen to create users ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options:
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - suffix=dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=ou=people,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - number=1000
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - rdn-cn=False
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - generic=True
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - start-idx=50
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - localize=False
INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ...
INFO  tests.suites.clu.dbgen_test:dbgen.py:196 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:122 Check if file exists
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:127 Get number of accounts before import
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:48 Stopping the server and running offline import...
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:133 Check that accounts are imported
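The logged options map onto lib389's LDIF generator, which the dsctl ldifgen subcommand wraps. A sketch using the dbgen_users helper; the helper name and keyword arguments are assumptions inferred from the logged options, so verify them against your lib389 version:

    from lib389.dbgen import dbgen_users  # assumption: helper name in recent lib389

    ldif_file = '/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif'
    # Generate 1000 generic users under ou=people, uid numbering starting at 50.
    dbgen_users(inst, 1000, ldif_file, 'dc=example,dc=com',
                generic=True, parent='ou=people,dc=example,dc=com', startIdx=50)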
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_groups 32.37
------------------------------Captured stderr call------------------------------
ldap_add: Already exists (68)
ldap_add: Already exists (68)
ldap_add: Already exists (68)
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:183 Run ldifgen to create group ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options:
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - NAME=myGroup
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=ou=groups,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - suffix=dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - number=1
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - num-members=1000
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-members=True
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - member-attr=uniquemember
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - member-parent=ou=people,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ...
INFO  tests.suites.clu.dbgen_test:dbgen.py:250 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:186 Check if file exists
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:191 Get number of accounts before import
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:200 Check that accounts are imported
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:203 Check that group is imported
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_cos_classic 0.13
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:256 Run ldifgen to create COS definition ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options:
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - type=classic
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - NAME=My_Postal_Def
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=ou=cos definitions,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-parent=True
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - cos-specifier=businessCategory
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - cos-attr=['postalcode', 'telephonenumber']
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - cos-template=cn=sales,cn=classicCoS,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ...
INFO  tests.suites.clu.dbgen_test:dbgen.py:304 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:259 Check if file exists
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:267 Check that COS definition is imported
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_cos_pointer 0.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:322 Run ldifgen to create COS definition ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options:
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - type=pointer
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - NAME=My_Postal_Def_pointer
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=ou=cos pointer definitions,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-parent=True
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - cos-attr=['postalcode', 'telephonenumber']
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - cos-template=cn=sales,cn=pointerCoS,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ...
INFO  tests.suites.clu.dbgen_test:dbgen.py:304 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:325 Check if file exists
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:333 Check that COS definition is imported
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_cos_indirect 0.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:387 Run ldifgen to create COS definition ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options:
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - type=indirect
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - NAME=My_Postal_Def_indirect
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=ou=cos indirect definitions,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-parent=True
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - cos-specifier=businessCategory
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - cos-attr=['postalcode', 'telephonenumber']
INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ...
INFO  tests.suites.clu.dbgen_test:dbgen.py:304 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:390 Check if file exists
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:398 Check that COS definition is imported
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_cos_template 0.13
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:449 Run ldifgen to create COS template ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options:
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - NAME=My_Template
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=ou=cos templates,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-parent=True
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - cos-priority=1
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - cos-attr-val=postalcode:12345
INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ...
INFO  tests.suites.clu.dbgen_test:dbgen.py:341 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:452 Check if file exists
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:460 Check that COS template is imported
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_managed_role 0.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:511 Run ldifgen to create managed role ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options:
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - NAME=My_Managed_Role
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=ou=managed roles,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-parent=True
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - type=managed
INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ...
INFO  tests.suites.clu.dbgen_test:dbgen.py:391 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:514 Check if file exists
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:522 Check that managed role is imported
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_filtered_role 0.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:571 Run ldifgen to create filtered role ldif INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options: INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - NAME=My_Filtered_Role INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=ou=filtered roles,dc=example,dc=com INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-parent=True INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - type=filtered INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - filter="objectclass=posixAccount" INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ... INFO  tests.suites.clu.dbgen_test:dbgen.py:391 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif INFO  tests.suites.clu.dbgen_test:dbgen_test.py:574 Check if file exists INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify INFO  tests.suites.clu.dbgen_test:dbgen_test.py:582 Check that filtered role is imported
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_nested_role 0.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:632 Run ldifgen to create nested role ldif INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options: INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - NAME=My_Nested_Role INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=ou=nested roles,dc=example,dc=com INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-parent=True INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - type=nested INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - role-dn=['cn=some_role,ou=roles,dc=example,dc=com'] INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ... INFO  tests.suites.clu.dbgen_test:dbgen.py:391 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif INFO  tests.suites.clu.dbgen_test:dbgen_test.py:635 Check if file exists INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify INFO  tests.suites.clu.dbgen_test:dbgen_test.py:643 Check that nested role is imported
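The three role types generated in the last three tests differ only in their objectclasses and one attribute. Sketches of the entry shapes follow, using the standard 389-ds role schema; dbgen's exact formatting may differ.

# Managed role: membership is assigned by adding nsRoleDN to user entries.
MANAGED = """\
dn: cn=My_Managed_Role,ou=managed roles,dc=example,dc=com
objectclass: top
objectclass: LDAPsubentry
objectclass: nsRoleDefinition
objectclass: nsSimpleRoleDefinition
objectclass: nsManagedRoleDefinition
cn: My_Managed_Role
"""

# Filtered role: members are whatever entries match nsRoleFilter.
FILTERED = """\
dn: cn=My_Filtered_Role,ou=filtered roles,dc=example,dc=com
objectclass: top
objectclass: LDAPsubentry
objectclass: nsRoleDefinition
objectclass: nsComplexRoleDefinition
objectclass: nsFilteredRoleDefinition
cn: My_Filtered_Role
nsRoleFilter: (objectclass=posixAccount)
"""

# Nested role: aggregates other roles listed in nsRoleDN.
NESTED = """\
dn: cn=My_Nested_Role,ou=nested roles,dc=example,dc=com
objectclass: top
objectclass: LDAPsubentry
objectclass: nsRoleDefinition
objectclass: nsComplexRoleDefinition
objectclass: nsNestedRoleDefinition
cn: My_Nested_Role
nsRoleDN: cn=some_role,ou=roles,dc=example,dc=com
"""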
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_mod_ldif_mixed 37.56
------------------------------Captured stderr call------------------------------
ldap_modify: Operation not allowed on RDN (67)   [message repeated 4 times]
ldap_rename: Invalid DN syntax (34) additional info: invalid RDN   [message repeated for each rejected modrdn operation]
ldapmodify: extra lines at end (line 43453, entry "uid=user0999,dc=example,dc=com")
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:702 Run ldifgen to create modification ldif INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options: INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=dc=example,dc=com INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-users=True INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - delete-users=True INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-parent=False INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - num-users=1000 INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - add-users=100 INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - del-users=999 INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - modrdn-users=100 INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - mod-users=10 INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - mod-attrs=['cn', 'uid', 'sn'] INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - randomize=False INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ... INFO  tests.suites.clu.dbgen_test:dbgen.py:467 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif INFO  tests.suites.clu.dbgen_test:dbgen_test.py:705 Check if file exists INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test INFO  tests.suites.clu.dbgen_test:dbgen_test.py:710 Get number of accounts before import INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify INFO  tests.suites.clu.dbgen_test:dbgen_test.py:719 Check that some accounts are imported
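The stderr block above is tolerated by the test: the generated modification LDIF includes operations the server rejects, the LDIF is applied with ldapmodify's continue-on-error mode, and the test only verifies that some accounts were imported (see "Check that some accounts are imported" above). A rough sketch of that apply step follows; host, port, and credentials are placeholders.

import subprocess

# Hypothetical sketch of the "Add entries from ldif file with ldapmodify"
# step.  -c (continuous mode) keeps ldapmodify running past the per-entry
# errors shown in the stderr capture above.
subprocess.run([
    "ldapmodify", "-x", "-c",
    "-D", "cn=Directory Manager", "-w", "password",
    "-H", "ldap://localhost:38901",
    "-f", "/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif",
], check=False)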
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_nested_ldif 25.80
------------------------------Captured stderr call------------------------------
ldap_add: Already exists (68)
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:759 Run ldifgen to create nested ldif INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options: INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - suffix=dc=example,dc=com INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - node-limit=100 INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - num-users=600 INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ... INFO  tests.suites.clu.dbgen_test:dbgen.py:500 Successfully created nested LDIF file (/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif) containing 6 nodes/subtrees INFO  tests.suites.clu.dbgen_test:dbgen_test.py:762 Check if file exists INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test INFO  tests.suites.clu.dbgen_test:dbgen_test.py:767 Get number of accounts before import INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify INFO  tests.suites.clu.dbgen_test:dbgen_test.py:779 Check that accounts are imported
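The before/after account comparison in these dbgen tests can be reproduced with lib389's generic account collection; a sketch, assuming `inst` is a connected DirSrv instance:

from lib389.idm.account import Accounts

# Count entries under the suffix before the import, then verify the
# generated LDIF added some.  Accounts.filter() is lib389's generic
# subtree search helper.
accounts = Accounts(inst, "dc=example,dc=com")
before = len(accounts.filter("(uid=*)"))
# ... apply created.ldif with ldapmodify as sketched above ...
after = len(accounts.filter("(uid=*)"))
assert after > before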
Passed suites/clu/dbmon_test.py::test_dsconf_dbmon 1.06
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbmon_test:dbmon_test.py:164 Sanity check for syntax INFO  LogCapture:monitor.py:247 DB Monitor Report: 2020-10-20 19:28:43 INFO  LogCapture:monitor.py:248 -------------------------------------------------------- INFO  LogCapture:monitor.py:249 Database Cache: INFO  LogCapture:monitor.py:250 - Cache Hit Ratio: 70% INFO  LogCapture:monitor.py:251 - Free Space: 487.05 MB INFO  LogCapture:monitor.py:252 - Free Percentage: 100.0% INFO  LogCapture:monitor.py:253 - RO Page Drops: 0 INFO  LogCapture:monitor.py:254 - Pages In: 8 INFO  LogCapture:monitor.py:255 - Pages Out: 0 INFO  LogCapture:monitor.py:256 INFO  LogCapture:monitor.py:257 Normalized DN Cache: INFO  LogCapture:monitor.py:258 - Cache Hit Ratio: 79% INFO  LogCapture:monitor.py:259 - Free Space: 19.99 MB INFO  LogCapture:monitor.py:260 - Free Percentage: 100.0% INFO  LogCapture:monitor.py:261 - DN Count: 53 INFO  LogCapture:monitor.py:262 - Evictions: 0 INFO  LogCapture:monitor.py:263 INFO  LogCapture:monitor.py:264 Backends: INFO  LogCapture:monitor.py:266 - dc=example,dc=com (userRoot): INFO  LogCapture:monitor.py:267 - Entry Cache Hit Ratio: 57% INFO  LogCapture:monitor.py:268 - Entry Cache Count: 2 INFO  LogCapture:monitor.py:269 - Entry Cache Free Space: 1.31 GB INFO  LogCapture:monitor.py:270 - Entry Cache Free Percentage: 100.0% INFO  LogCapture:monitor.py:271 - Entry Cache Average Size: 3.78 KB INFO  LogCapture:monitor.py:272 - DN Cache Hit Ratio: 0% INFO  LogCapture:monitor.py:273 - DN Cache Count: 2 INFO  LogCapture:monitor.py:274 - DN Cache Free Space: 192.0 MB INFO  LogCapture:monitor.py:275 - DN Cache Free Percentage: 100.0% INFO  LogCapture:monitor.py:276 - DN Cache Average Size: 63.0 B INFO  LogCapture:monitor.py:286 INFO  tests.suites.clu.dbmon_test:dbmon_test.py:133 Clear the log INFO  tests.suites.clu.dbmon_test:dbmon_test.py:171 Sanity check for --indexes output INFO  LogCapture:monitor.py:247 DB Monitor Report: 2020-10-20 19:28:43 INFO  LogCapture:monitor.py:248 -------------------------------------------------------- INFO  LogCapture:monitor.py:249 Database Cache: INFO  LogCapture:monitor.py:250 - Cache Hit Ratio: 70% INFO  LogCapture:monitor.py:251 - Free Space: 487.05 MB INFO  LogCapture:monitor.py:252 - Free Percentage: 100.0% INFO  LogCapture:monitor.py:253 - RO Page Drops: 0 INFO  LogCapture:monitor.py:254 - Pages In: 8 INFO  LogCapture:monitor.py:255 - Pages Out: 0 INFO  LogCapture:monitor.py:256 INFO  LogCapture:monitor.py:257 Normalized DN Cache: INFO  LogCapture:monitor.py:258 - Cache Hit Ratio: 79% INFO  LogCapture:monitor.py:259 - Free Space: 19.99 MB INFO  LogCapture:monitor.py:260 - Free Percentage: 100.0% INFO  LogCapture:monitor.py:261 - DN Count: 53 INFO  LogCapture:monitor.py:262 - Evictions: 0 INFO  LogCapture:monitor.py:263 INFO  LogCapture:monitor.py:264 Backends: INFO  LogCapture:monitor.py:266 - dc=example,dc=com (userRoot): INFO  LogCapture:monitor.py:267 - Entry Cache Hit Ratio: 57% INFO  LogCapture:monitor.py:268 - Entry Cache Count: 2 INFO  LogCapture:monitor.py:269 - Entry Cache Free Space: 1.31 GB INFO  LogCapture:monitor.py:270 - Entry Cache Free Percentage: 100.0% INFO  LogCapture:monitor.py:271 - Entry Cache Average Size: 3.78 KB INFO  LogCapture:monitor.py:272 - DN Cache Hit Ratio: 0% INFO  LogCapture:monitor.py:273 - DN Cache Count: 2 INFO  LogCapture:monitor.py:274 - DN Cache Free Space: 192.0 MB INFO  LogCapture:monitor.py:275 - DN Cache Free Percentage: 100.0% INFO  LogCapture:monitor.py:276 - DN Cache Average Size: 63.0 B INFO  LogCapture:monitor.py:278 - 
Indexes: INFO  LogCapture:monitor.py:280 - Index: objectclass.db INFO  LogCapture:monitor.py:281 - Cache Hit: 11 INFO  LogCapture:monitor.py:282 - Cache Miss: 2 INFO  LogCapture:monitor.py:283 - Page In: 2 INFO  LogCapture:monitor.py:284 - Page Out: 0 INFO  LogCapture:monitor.py:285 INFO  LogCapture:monitor.py:280 - Index: entryrdn.db INFO  LogCapture:monitor.py:281 - Cache Hit: 5 INFO  LogCapture:monitor.py:282 - Cache Miss: 2 INFO  LogCapture:monitor.py:283 - Page In: 2 INFO  LogCapture:monitor.py:284 - Page Out: 0 INFO  LogCapture:monitor.py:285 INFO  LogCapture:monitor.py:280 - Index: id2entry.db INFO  LogCapture:monitor.py:281 - Cache Hit: 2 INFO  LogCapture:monitor.py:282 - Cache Miss: 2 INFO  LogCapture:monitor.py:283 - Page In: 2 INFO  LogCapture:monitor.py:284 - Page Out: 0 INFO  LogCapture:monitor.py:285 INFO  LogCapture:monitor.py:280 - Index: aci.db INFO  LogCapture:monitor.py:281 - Cache Hit: 1 INFO  LogCapture:monitor.py:282 - Cache Miss: 2 INFO  LogCapture:monitor.py:283 - Page In: 2 INFO  LogCapture:monitor.py:284 - Page Out: 0 INFO  LogCapture:monitor.py:285 INFO  LogCapture:monitor.py:286 INFO  tests.suites.clu.dbmon_test:dbmon_test.py:133 Clear the log INFO  tests.suites.clu.dbmon_test:dbmon_test.py:179 Sanity check for --json output INFO  LogCapture:monitor.py:245 { "date": "2020-10-20 19:28:44", "dbcache": { "hit_ratio": "70", "free": "487.05 MB", "free_percentage": "100.0", "roevicts": "0", "pagein": "8", "pageout": "0" }, "ndncache": { "hit_ratio": "79", "free": "19.99 MB", "free_percentage": "100.0", "count": "53", "evictions": "0" }, "backends": { "userRoot": { "suffix": "dc=example,dc=com", "entry_cache_count": "2", "entry_cache_free": "1.31 GB", "entry_cache_free_percentage": "100.0", "entry_cache_size": "3.78 KB", "entry_cache_hit_ratio": "57", "dn_cache_count": "2", "dn_cache_free": "192.0 MB", "dn_cache_free_percentage": "100.0", "dn_cache_size": "63.0 B", "dn_cache_hit_ratio": "0", "indexes": [ { "name": "objectclass.db", "cachehit": "11", "cachemiss": "2", "pagein": "2", "pageout": "0" }, { "name": "entryrdn.db", "cachehit": "5", "cachemiss": "2", "pagein": "2", "pageout": "0" }, { "name": "id2entry.db", "cachehit": "2", "cachemiss": "2", "pagein": "2", "pageout": "0" }, { "name": "aci.db", "cachehit": "1", "cachemiss": "2", "pagein": "2", "pageout": "0" } ] } } } INFO  tests.suites.clu.dbmon_test:dbmon_test.py:133 Clear the log
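Because the --json variant prints plain JSON (captured above), the dbmon output is straightforward to consume from a monitoring script, for example:

import json
import sys

# Read the dbmon --json output (e.g. piped in) and pull out a few fields;
# key names are taken from the JSON captured in the log above.
report = json.loads(sys.stdin.read())
print(report["dbcache"]["hit_ratio"])            # "70" in the run above
for be_name, be in report["backends"].items():
    print(be_name, be["suffix"], be["entry_cache_hit_ratio"])
    for index in be["indexes"]:
        print("  ", index["name"], "hits:", index["cachehit"])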
Passed suites/clu/dbverify_test.py::test_dsctl_dbverify 2.53
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
[20/Oct/2020:19:28:52.838524894 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbverify_test:dbverify_test.py:63 Run dbverify INFO  tests.suites.clu.dbverify_test:dbtasks.py:88 dbverify successful INFO  tests.suites.clu.dbverify_test:dbverify_test.py:67 Check dbverify was successful
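dbverify is driven through dsctl and checks the instance's database files, typically with the server offline. A rough sketch of the invocation under test; the instance name here is the one from the setup log, and the exact CLI spelling is an assumption.

import subprocess

# Sketch: run dsctl's dbverify task against the standalone instance.
subprocess.run(["dsctl", "standalone1", "dbverify"], check=True)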
Passed suites/clu/dsidm_config_test.py::test_dsidm_config_sssd 4.77
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stdout call------------------------------
# # sssd.conf # Generated by 389 Directory Server - dsidm # # For more details see man sssd.conf and man sssd-ldap # Be sure to review the content of this file to ensure it is secure and correct # in your environment. [domain/ldap] # Uncomment this for more verbose logging. # debug_level=3 # Cache hashes of user authentication for offline auth. cache_credentials = True id_provider = ldap auth_provider = ldap access_provider = ldap chpass_provider = ldap ldap_schema = rfc2307 ldap_search_base = dc=example,dc=com ldap_uri = ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:38901 # If you have DNS SRV records, you can use the following instead. This derives # from your ldap_search_base. # ldap_uri = _srv_ ldap_tls_reqcert = demand # To use cacert dir, place *.crt files in this path then run: # /usr/bin/openssl rehash /etc/openldap/certs # or (for older versions of openssl) # /usr/bin/c_rehash /etc/openldap/certs ldap_tls_cacertdir = /etc/openldap/certs # Path to the cacert # ldap_tls_cacert = /etc/openldap/certs/ca.crt # Only users who match this filter can login and authorise to this machine. Note # that users who do NOT match, will still have their uid/gid resolve, but they # can't login. # ldap_access_filter = (memberOf=<dn>) enumerate = false access_provider = ldap ldap_user_member_of = memberof ldap_user_gecos = cn ldap_user_uuid = nsUniqueId ldap_group_uuid = nsUniqueId # This is really important as it allows SSSD to respect nsAccountLock ldap_account_expire_policy = rhds ldap_access_order = filter, expire # Setup for ssh keys # Inside /etc/ssh/sshd_config add the lines: # AuthorizedKeysCommand /usr/bin/sss_ssh_authorizedkeys # AuthorizedKeysCommandUser nobody # You can test with the command: sss_ssh_authorizedkeys <username> ldap_user_ssh_public_key = nsSshPublicKey # This prevents an issue where the Directory is recursively walked on group # and user look ups. It makes the client faster and more responsive in almost # every scenario. ignore_group_members = False [sssd] services = nss, pam, ssh, sudo config_file_version = 2 domains = ldap [nss] homedir_substring = /home # # sssd.conf # Generated by 389 Directory Server - dsidm # # For more details see man sssd.conf and man sssd-ldap # Be sure to review the content of this file to ensure it is secure and correct # in your environment. [domain/ldap] # Uncomment this for more verbose logging. # debug_level=3 # Cache hashes of user authentication for offline auth. cache_credentials = True id_provider = ldap auth_provider = ldap access_provider = ldap chpass_provider = ldap ldap_schema = rfc2307bis ldap_search_base = dc=example,dc=com ldap_uri = ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:38901 # If you have DNS SRV records, you can use the following instead. This derives # from your ldap_search_base. # ldap_uri = _srv_ ldap_tls_reqcert = demand # To use cacert dir, place *.crt files in this path then run: # /usr/bin/openssl rehash /etc/openldap/certs # or (for older versions of openssl) # /usr/bin/c_rehash /etc/openldap/certs ldap_tls_cacertdir = /etc/openldap/certs # Path to the cacert # ldap_tls_cacert = /etc/openldap/certs/ca.crt # Only users who match this filter can login and authorise to this machine. Note # that users who do NOT match, will still have their uid/gid resolve, but they # can't login. 
ldap_access_filter = (memberOf=cn=new_group,ou=Groups,dc=example,dc=com) enumerate = false access_provider = ldap ldap_user_member_of = memberof ldap_user_gecos = cn ldap_user_uuid = nsUniqueId ldap_group_uuid = nsUniqueId # This is really important as it allows SSSD to respect nsAccountLock ldap_account_expire_policy = rhds ldap_access_order = filter, expire # Setup for ssh keys # Inside /etc/ssh/sshd_config add the lines: # AuthorizedKeysCommand /usr/bin/sss_ssh_authorizedkeys # AuthorizedKeysCommandUser nobody # You can test with the command: sss_ssh_authorizedkeys <username> ldap_user_ssh_public_key = nsSshPublicKey # This prevents an issue where the Directory is recursively walked on group # and user look ups. It makes the client faster and more responsive in almost # every scenario. ignore_group_members = False [sssd] services = nss, pam, ssh, sudo config_file_version = 2 domains = ldap [nss] homedir_substring = /home
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:102 Create sssd.conf content DEBUG  tests.suites.clu.dsidm_config_test:client_config.py:114 # # sssd.conf # Generated by 389 Directory Server - dsidm # # For more details see man sssd.conf and man sssd-ldap # Be sure to review the content of this file to ensure it is secure and correct # in your environment. [domain/ldap] # Uncomment this for more verbose logging. # debug_level=3 # Cache hashes of user authentication for offline auth. cache_credentials = True id_provider = ldap auth_provider = ldap access_provider = ldap chpass_provider = ldap ldap_schema = rfc2307 ldap_search_base = dc=example,dc=com ldap_uri = ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:38901 # If you have DNS SRV records, you can use the following instead. This derives # from your ldap_search_base. # ldap_uri = _srv_ ldap_tls_reqcert = demand # To use cacert dir, place *.crt files in this path then run: # /usr/bin/openssl rehash /etc/openldap/certs # or (for older versions of openssl) # /usr/bin/c_rehash /etc/openldap/certs ldap_tls_cacertdir = /etc/openldap/certs # Path to the cacert # ldap_tls_cacert = /etc/openldap/certs/ca.crt # Only users who match this filter can login and authorise to this machine. Note # that users who do NOT match, will still have their uid/gid resolve, but they # can't login. # ldap_access_filter = (memberOf=<dn>) enumerate = false access_provider = ldap ldap_user_member_of = memberof ldap_user_gecos = cn ldap_user_uuid = nsUniqueId ldap_group_uuid = nsUniqueId # This is really important as it allows SSSD to respect nsAccountLock ldap_account_expire_policy = rhds ldap_access_order = filter, expire # Setup for ssh keys # Inside /etc/ssh/sshd_config add the lines: # AuthorizedKeysCommand /usr/bin/sss_ssh_authorizedkeys # AuthorizedKeysCommandUser nobody # You can test with the command: sss_ssh_authorizedkeys <username> ldap_user_ssh_public_key = nsSshPublicKey # This prevents an issue where the Directory is recursively walked on group # and user look ups. It makes the client faster and more responsive in almost # every scenario. ignore_group_members = False [sssd] services = nss, pam, ssh, sudo config_file_version = 2 domains = ldap [nss] homedir_substring = /home INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:105 Check if config creation was successful INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:52 Check if content is present in output INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:57 Check if value is present in output INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:60 Reset log file for next test INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:108 Now we test allowed_group argument INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:109 Enable MemberOf plugin INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:114 Create test group INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:119 Create sssd.conf content with allowed group DEBUG  tests.suites.clu.dsidm_config_test:client_config.py:114 # # sssd.conf # Generated by 389 Directory Server - dsidm # # For more details see man sssd.conf and man sssd-ldap # Be sure to review the content of this file to ensure it is secure and correct # in your environment. [domain/ldap] # Uncomment this for more verbose logging. # debug_level=3 # Cache hashes of user authentication for offline auth. 
cache_credentials = True id_provider = ldap auth_provider = ldap access_provider = ldap chpass_provider = ldap ldap_schema = rfc2307bis ldap_search_base = dc=example,dc=com ldap_uri = ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:38901 # If you have DNS SRV records, you can use the following instead. This derives # from your ldap_search_base. # ldap_uri = _srv_ ldap_tls_reqcert = demand # To use cacert dir, place *.crt files in this path then run: # /usr/bin/openssl rehash /etc/openldap/certs # or (for older versions of openssl) # /usr/bin/c_rehash /etc/openldap/certs ldap_tls_cacertdir = /etc/openldap/certs # Path to the cacert # ldap_tls_cacert = /etc/openldap/certs/ca.crt # Only users who match this filter can login and authorise to this machine. Note # that users who do NOT match, will still have their uid/gid resolve, but they # can't login. ldap_access_filter = (memberOf=cn=new_group,ou=Groups,dc=example,dc=com) enumerate = false access_provider = ldap ldap_user_member_of = memberof ldap_user_gecos = cn ldap_user_uuid = nsUniqueId ldap_group_uuid = nsUniqueId # This is really important as it allows SSSD to respect nsAccountLock ldap_account_expire_policy = rhds ldap_access_order = filter, expire # Setup for ssh keys # Inside /etc/ssh/sshd_config add the lines: # AuthorizedKeysCommand /usr/bin/sss_ssh_authorizedkeys # AuthorizedKeysCommandUser nobody # You can test with the command: sss_ssh_authorizedkeys <username> ldap_user_ssh_public_key = nsSshPublicKey # This prevents an issue where the Directory is recursively walked on group # and user look ups. It makes the client faster and more responsive in almost # every scenario. ignore_group_members = False [sssd] services = nss, pam, ssh, sudo config_file_version = 2 domains = ldap [nss] homedir_substring = /home INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:124 Check if config creation was successful INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:48 Check if content is present in output INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:60 Reset log file for next test
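A rough sketch of the command this test drives, dsidm's client_config subcommand; the exact option spelling is an assumption based on the captured output.

import subprocess

# Generate an sssd.conf for this directory; the ldap.conf variant shown in
# the next test is the analogous "client_config ldap.conf" invocation.
subprocess.run(
    ["dsidm", "standalone1", "client_config", "sssd.conf"],
    check=True,
)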
Passed suites/clu/dsidm_config_test.py::test_dsidm_config_ldap 0.00
------------------------------Captured stdout call------------------------------
# # OpenLDAP client configuration # Generated by 389 Directory Server - dsidm # # See ldap.conf(5) for details # This file should be world readable but not world writable. BASE dc=example,dc=com # Remember to check this: you can have multiple uris on this line. You may have # multiple servers or load balancers in your environment. URI ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:38901 # If you have DNS SRV records you can use: # URI ldaps:///dc%3Dexample%2Cdc%3Dcom DEREF never # To use cacert dir, place *.crt files in this path then run: # /usr/bin/openssl rehash /etc/openldap/certs # or (for older versions of openssl) # /usr/bin/c_rehash /etc/openldap/certs TLS_CACERTDIR /etc/openldap/certs # TLS_CACERT /etc/openldap/certs/ca.crt
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:152 Create ldap.conf content DEBUG  tests.suites.clu.dsidm_config_test:client_config.py:155 # # OpenLDAP client configuration # Generated by 389 Directory Server - dsidm # # See ldap.conf(5) for details # This file should be world readable but not world writable. BASE dc=example,dc=com # Remember to check this: you can have multiple uris on this line. You may have # multiple servers or load balancers in your environment. URI ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:38901 # If you have DNS SRV records you can use: # URI ldaps:///dc%3Dexample%2Cdc%3Dcom DEREF never # To use cacert dir, place *.crt files in this path then run: # /usr/bin/openssl rehash /etc/openldap/certs # or (for older versions of openssl) # /usr/bin/c_rehash /etc/openldap/certs TLS_CACERTDIR /etc/openldap/certs # TLS_CACERT /etc/openldap/certs/ca.crt INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:155 Check if config creation was successful INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:52 Check if content is present in output INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:60 Reset log file for next test
Passed suites/clu/dsidm_config_test.py::test_dsidm_config_display 4.23
------------------------------Captured stdout call------------------------------
# This is a generic list of LDAP client configuration parameters you may require # for connecting a client to this server. Some of them may or may not apply # to your application, so consult your application documentation for further # assistance. # # This program makes a number of assumptions about your data and configuration # which may not be correct. Be sure to check these values for your situation. ; ldap uri ; This is the uri of the server you will connect to and authenticate to. It ; must be a valid subjectAltName in the presented TLS certificate. Note that this ; is not an exhaustive list of your LDAP servers, and other applications in your ; network like load balancers may affect this. This is just what we derive from ; your current connection. ldap_uri = ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:38901 ; ldap dns discovery uri ; In some environments, you may have DNS SRV records such as ; "_ldap._tcp.<domain name>". If these are present in your dns server, you can ; use the following uri. ldap_uri = ldaps:///dc%3Dexample%2Cdc%3Dcom ; ca_cert ; To correctly use TLS, you require the valid CA cert that issued your LDAP TLS ; certificates. Sometimes a copy of this may be in your server instance as ca_cert = /etc/dirsrv/slapd-<instance>/ca.crt ; However that's not guaranteed. You can show the certs from the LDAP server ; by sshing to the server and running: certutil -L -d /etc/dirsrv/slapd-<instance>/ ; If you can identify the CA certificate name, you can then view it with: certutil -L -n <ca cert name> -a -d /etc/dirsrv/slapd-<instance>/ ; This should be a pem file you can use in your application's CA. ; Some applications don't require a ca certificate parameter, and will use the ; ca certificate from /etc/openldap/ldap.conf. You should configure ldap.conf ; in these cases. See the 'client_config ldap.conf' command in dsidm. ; basedn ; The basedn is the root suffix where all searches will originate from for ; LDAP objects. basedn = dc=example,dc=com ; schema_type ; LDAP servers have different ways to structure their objects and group ; relationships. Legacy servers will use rfc2307, where as modern servers will ; use rfc2307bis (requires MemberOf plugin to be enabled). This is the schema ; setting of your directory based on your running configuration (if we can ; detect it). schema_type = rfc2307bis ; user/account basedn ; Some applications may optionally use a user/account basedn to limit searches ; in the directory. This can be for performance or security reasons. Generally ; you shouldn't need this, preferring to use groups and filters for access ; control. user_basedn = ou=people,dc=example,dc=com ; user filter ; This is an ldap filter that will return only user objects. Additionally some ; applications will template into the filter (similar to sql statements) or they ; will generate the filter based on attributes. We list a number of possible ; filters you might use, but you should customise this for your application. ; ; If you are using rfc2307bis, you can use this filter to provide authorisation ; support by adding filters such as: (memberOf=<groupdn>) user_filter = (&(objectclass=nsPerson)(objectclass=nsAccount)(objectclass=nsOrgPerson)(objectclass=posixAccount)) user_filter = (&(&(objectclass=nsPerson)(objectclass=nsAccount)(objectclass=nsOrgPerson)(objectclass=posixAccount))(|(uid=<PARAM>)(displayName=<PARAM>)(cn=<PARAM>))) ; group basedn ; Some applications may optionnaly use a group basedn to limit searches in the ; directory. 
This can be for performance or security reasons. Generally you ; shouldn't need this, preferring to use groups and filters for access control. group_basedn = ou=Groups,dc=example,dc=com ; group filter ; This is an ldap filter that will return only group objects. Additionally ; some applications will template into the filter (similar to sql statements) ; or they will generate the filter base on attributes. We list a number of ; possible filters you might use, but you should customise this for your ; application. group_filter = (&(objectclass=groupOfNames)) group_filter = (&(&(objectclass=groupOfNames))(|(cn=<PARAM>))) ; attribute mappings ; Due to the variety of schemas and attribute mappings in LDAP, there are ; different representations of attributes and values. This is a guess at ; the mappings that exist in your server, and what attributes you should ; configure and use. unique id = nsUniqueId user rdn = uid user identifier = uid group rdn = cn group member attribute = member # This is a generic list of LDAP client configuration parameters you may require # for connecting a client to this server. Some of them may or may not apply # to your application, so consult your application documentation for further # assistance. # # This program makes a number of assumptions about your data and configuration # which may not be correct. Be sure to check these values for your situation. ; ldap uri ; This is the uri of the server you will connect to and authenticate to. It ; must be a valid subjectAltName in the presented TLS certificate. Note that this ; is not an exhaustive list of your LDAP servers, and other applications in your ; network like load balancers may affect this. This is just what we derive from ; your current connection. ldap_uri = ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:38901 ; ldap dns discovery uri ; In some environments, you may have DNS SRV records such as ; "_ldap._tcp.<domain name>". If these are present in your dns server, you can ; use the following uri. ldap_uri = ldaps:///dc%3Dexample%2Cdc%3Dcom ; ca_cert ; To correctly use TLS, you require the valid CA cert that issued your LDAP TLS ; certificates. Sometimes a copy of this may be in your server instance as ca_cert = /etc/dirsrv/slapd-<instance>/ca.crt ; However that's not guaranteed. You can show the certs from the LDAP server ; by sshing to the server and running: certutil -L -d /etc/dirsrv/slapd-<instance>/ ; If you can identify the CA certificate name, you can then view it with: certutil -L -n <ca cert name> -a -d /etc/dirsrv/slapd-<instance>/ ; This should be a pem file you can use in your application's CA. ; Some applications don't require a ca certificate parameter, and will use the ; ca certificate from /etc/openldap/ldap.conf. You should configure ldap.conf ; in these cases. See the 'client_config ldap.conf' command in dsidm. ; basedn ; The basedn is the root suffix where all searches will originate from for ; LDAP objects. basedn = dc=example,dc=com ; schema_type ; LDAP servers have different ways to structure their objects and group ; relationships. Legacy servers will use rfc2307, where as modern servers will ; use rfc2307bis (requires MemberOf plugin to be enabled). This is the schema ; setting of your directory based on your running configuration (if we can ; detect it). schema_type = rfc2307bis ; user/account basedn ; Some applications may optionally use a user/account basedn to limit searches ; in the directory. This can be for performance or security reasons. 
Generally ; you shouldn't need this, preferring to use groups and filters for access ; control. user_basedn = ou=people,dc=example,dc=com ; user filter ; This is an ldap filter that will return only user objects. Additionally some ; applications will template into the filter (similar to sql statements) or they ; will generate the filter based on attributes. We list a number of possible ; filters you might use, but you should customise this for your application. ; ; If you are using rfc2307bis, you can use this filter to provide authorisation ; support by adding filters such as: (memberOf=<groupdn>) user_filter = (&(objectclass=nsPerson)(objectclass=nsAccount)(objectclass=nsOrgPerson)(objectclass=posixAccount)) user_filter = (&(&(objectclass=nsPerson)(objectclass=nsAccount)(objectclass=nsOrgPerson)(objectclass=posixAccount))(|(uid=<PARAM>)(displayName=<PARAM>)(cn=<PARAM>))) ; group basedn ; Some applications may optionnaly use a group basedn to limit searches in the ; directory. This can be for performance or security reasons. Generally you ; shouldn't need this, preferring to use groups and filters for access control. group_basedn = ou=Groups,dc=example,dc=com ; group filter ; This is an ldap filter that will return only group objects. Additionally ; some applications will template into the filter (similar to sql statements) ; or they will generate the filter base on attributes. We list a number of ; possible filters you might use, but you should customise this for your ; application. group_filter = (&(objectclass=groupOfNames)) group_filter = (&(&(objectclass=groupOfNames))(|(cn=<PARAM>))) ; attribute mappings ; Due to the variety of schemas and attribute mappings in LDAP, there are ; different representations of attributes and values. This is a guess at ; the mappings that exist in your server, and what attributes you should ; configure and use. unique id = nsUniqueId user rdn = uid user identifier = uid group rdn = cn group member attribute = member
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:196 Test dsidm display option DEBUG  tests.suites.clu.dsidm_config_test:client_config.py:290 # This is a generic list of LDAP client configuration parameters you may require # for connecting a client to this server. Some of them may or may not apply # to your application, so consult your application documentation for further # assistance. # # This program makes a number of assumptions about your data and configuration # which may not be correct. Be sure to check these values for your situation. ; ldap uri ; This is the uri of the server you will connect to and authenticate to. It ; must be a valid subjectAltName in the presented TLS certificate. Note that this ; is not an exhaustive list of your LDAP servers, and other applications in your ; network like load balancers may affect this. This is just what we derive from ; your current connection. ldap_uri = ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:38901 ; ldap dns discovery uri ; In some environments, you may have DNS SRV records such as ; "_ldap._tcp.<domain name>". If these are present in your dns server, you can ; use the following uri. ldap_uri = ldaps:///dc%3Dexample%2Cdc%3Dcom ; ca_cert ; To correctly use TLS, you require the valid CA cert that issued your LDAP TLS ; certificates. Sometimes a copy of this may be in your server instance as ca_cert = /etc/dirsrv/slapd-<instance>/ca.crt ; However that's not guaranteed. You can show the certs from the LDAP server ; by sshing to the server and running: certutil -L -d /etc/dirsrv/slapd-<instance>/ ; If you can identify the CA certificate name, you can then view it with: certutil -L -n <ca cert name> -a -d /etc/dirsrv/slapd-<instance>/ ; This should be a pem file you can use in your application's CA. ; Some applications don't require a ca certificate parameter, and will use the ; ca certificate from /etc/openldap/ldap.conf. You should configure ldap.conf ; in these cases. See the 'client_config ldap.conf' command in dsidm. ; basedn ; The basedn is the root suffix where all searches will originate from for ; LDAP objects. basedn = dc=example,dc=com ; schema_type ; LDAP servers have different ways to structure their objects and group ; relationships. Legacy servers will use rfc2307, where as modern servers will ; use rfc2307bis (requires MemberOf plugin to be enabled). This is the schema ; setting of your directory based on your running configuration (if we can ; detect it). schema_type = rfc2307bis ; user/account basedn ; Some applications may optionally use a user/account basedn to limit searches ; in the directory. This can be for performance or security reasons. Generally ; you shouldn't need this, preferring to use groups and filters for access ; control. user_basedn = ou=people,dc=example,dc=com ; user filter ; This is an ldap filter that will return only user objects. Additionally some ; applications will template into the filter (similar to sql statements) or they ; will generate the filter based on attributes. We list a number of possible ; filters you might use, but you should customise this for your application. 
; ; If you are using rfc2307bis, you can use this filter to provide authorisation ; support by adding filters such as: (memberOf=<groupdn>) user_filter = (&(objectclass=nsPerson)(objectclass=nsAccount)(objectclass=nsOrgPerson)(objectclass=posixAccount)) user_filter = (&(&(objectclass=nsPerson)(objectclass=nsAccount)(objectclass=nsOrgPerson)(objectclass=posixAccount))(|(uid=<PARAM>)(displayName=<PARAM>)(cn=<PARAM>))) ; group basedn ; Some applications may optionnaly use a group basedn to limit searches in the ; directory. This can be for performance or security reasons. Generally you ; shouldn't need this, preferring to use groups and filters for access control. group_basedn = ou=Groups,dc=example,dc=com ; group filter ; This is an ldap filter that will return only group objects. Additionally ; some applications will template into the filter (similar to sql statements) ; or they will generate the filter base on attributes. We list a number of ; possible filters you might use, but you should customise this for your ; application. group_filter = (&(objectclass=groupOfNames)) group_filter = (&(&(objectclass=groupOfNames))(|(cn=<PARAM>))) ; attribute mappings ; Due to the variety of schemas and attribute mappings in LDAP, there are ; different representations of attributes and values. This is a guess at ; the mappings that exist in your server, and what attributes you should ; configure and use. unique id = nsUniqueId user rdn = uid user identifier = uid group rdn = cn group member attribute = member INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:199 Check if display option was successful INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:52 Check if content is present in output INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:57 Check if value is present in output INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:60 Reset log file for next test INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:202 Enable MemberOf plugin INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:207 Test dsidm display option with MemberOf plugin DEBUG  tests.suites.clu.dsidm_config_test:client_config.py:290 # This is a generic list of LDAP client configuration parameters you may require # for connecting a client to this server. Some of them may or may not apply # to your application, so consult your application documentation for further # assistance. # # This program makes a number of assumptions about your data and configuration # which may not be correct. Be sure to check these values for your situation. ; ldap uri ; This is the uri of the server you will connect to and authenticate to. It ; must be a valid subjectAltName in the presented TLS certificate. Note that this ; is not an exhaustive list of your LDAP servers, and other applications in your ; network like load balancers may affect this. This is just what we derive from ; your current connection. ldap_uri = ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:38901 ; ldap dns discovery uri ; In some environments, you may have DNS SRV records such as ; "_ldap._tcp.<domain name>". If these are present in your dns server, you can ; use the following uri. ldap_uri = ldaps:///dc%3Dexample%2Cdc%3Dcom ; ca_cert ; To correctly use TLS, you require the valid CA cert that issued your LDAP TLS ; certificates. Sometimes a copy of this may be in your server instance as ca_cert = /etc/dirsrv/slapd-<instance>/ca.crt ; However that's not guaranteed. 
You can show the certs from the LDAP server ; by sshing to the server and running: certutil -L -d /etc/dirsrv/slapd-<instance>/ ; If you can identify the CA certificate name, you can then view it with: certutil -L -n <ca cert name> -a -d /etc/dirsrv/slapd-<instance>/ ; This should be a pem file you can use in your application's CA. ; Some applications don't require a ca certificate parameter, and will use the ; ca certificate from /etc/openldap/ldap.conf. You should configure ldap.conf ; in these cases. See the 'client_config ldap.conf' command in dsidm. ; basedn ; The basedn is the root suffix where all searches will originate from for ; LDAP objects. basedn = dc=example,dc=com ; schema_type ; LDAP servers have different ways to structure their objects and group ; relationships. Legacy servers will use rfc2307, where as modern servers will ; use rfc2307bis (requires MemberOf plugin to be enabled). This is the schema ; setting of your directory based on your running configuration (if we can ; detect it). schema_type = rfc2307bis ; user/account basedn ; Some applications may optionally use a user/account basedn to limit searches ; in the directory. This can be for performance or security reasons. Generally ; you shouldn't need this, preferring to use groups and filters for access ; control. user_basedn = ou=people,dc=example,dc=com ; user filter ; This is an ldap filter that will return only user objects. Additionally some ; applications will template into the filter (similar to sql statements) or they ; will generate the filter based on attributes. We list a number of possible ; filters you might use, but you should customise this for your application. ; ; If you are using rfc2307bis, you can use this filter to provide authorisation ; support by adding filters such as: (memberOf=<groupdn>) user_filter = (&(objectclass=nsPerson)(objectclass=nsAccount)(objectclass=nsOrgPerson)(objectclass=posixAccount)) user_filter = (&(&(objectclass=nsPerson)(objectclass=nsAccount)(objectclass=nsOrgPerson)(objectclass=posixAccount))(|(uid=<PARAM>)(displayName=<PARAM>)(cn=<PARAM>))) ; group basedn ; Some applications may optionnaly use a group basedn to limit searches in the ; directory. This can be for performance or security reasons. Generally you ; shouldn't need this, preferring to use groups and filters for access control. group_basedn = ou=Groups,dc=example,dc=com ; group filter ; This is an ldap filter that will return only group objects. Additionally ; some applications will template into the filter (similar to sql statements) ; or they will generate the filter base on attributes. We list a number of ; possible filters you might use, but you should customise this for your ; application. group_filter = (&(objectclass=groupOfNames)) group_filter = (&(&(objectclass=groupOfNames))(|(cn=<PARAM>))) ; attribute mappings ; Due to the variety of schemas and attribute mappings in LDAP, there are ; different representations of attributes and values. This is a guess at ; the mappings that exist in your server, and what attributes you should ; configure and use. 
unique id = nsUniqueId user rdn = uid user identifier = uid group rdn = cn group member attribute = member INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:210 Check if display option was successful with MemberOf plugin enabled INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:52 Check if content is present in output INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:57 Check if value is present in output INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:60 Reset log file for next test
Passed suites/clu/fixup_test.py::test_posix_winsync_fixup 8.39
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.fixup_test:fixup_test.py:73 Enable POSIXWinsyncPlugin INFO  tests.suites.clu.fixup_test:fixup_test.py:77 Stopping the server and importing posix accounts INFO  tests.suites.clu.fixup_test:fixup_test.py:87 Run Fixup task INFO  tests.suites.clu.fixup_test:posix_winsync.py:29 Attempting to add task entry... INFO  tests.suites.clu.fixup_test:posix_winsync.py:39 Successfully added task entry INFO  tests.suites.clu.fixup_test:fixup_test.py:90 Check log if fixup task was successful
Passed suites/config/autotuning_test.py::test_threads_basic 0.02
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:39 Set nsslapd-threadnumber: -1 to enable autotuning INFO  tests.suites.config.autotuning_test:autotuning_test.py:42 Assert nsslapd-threadnumber is equal to the documented expected value
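The toggle being tested maps to a one-attribute change on cn=config; with lib389 this is roughly the following, assuming `standalone` is a connected DirSrv instance:

# Setting nsslapd-threadnumber to -1 asks the server to autotune the
# worker thread count; per the assertion logged above, reading the
# attribute back returns the computed value, not -1.
standalone.config.replace("nsslapd-threadnumber", "-1")
autotuned = standalone.config.get_attr_val_int("nsslapd-threadnumber")
assert autotuned > 0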
Passed suites/config/autotuning_test.py::test_threads_warning 1.31
No log output captured.
Passed suites/config/autotuning_test.py::test_threads_invalid_value[-2] 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:87 Set nsslapd-threadnumber: -2. Operation should fail
Passed suites/config/autotuning_test.py::test_threads_invalid_value[0] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:87 Set nsslapd-threadnumber: 0. Operation should fail
Passed suites/config/autotuning_test.py::test_threads_invalid_value[invalid] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:87 Set nsslapd-threadnumber: invalid. Operation should fail
Passed suites/config/autotuning_test.py::test_threads_back_from_manual_value 0.05
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:109 Set nsslapd-threadnumber: -1 to enable autotuning and save the new value INFO  tests.suites.config.autotuning_test:autotuning_test.py:113 Set nsslapd-threadnumber to the autotuned value decreased by 2 INFO  tests.suites.config.autotuning_test:autotuning_test.py:118 Set nsslapd-threadnumber: -1 to enable autotuning INFO  tests.suites.config.autotuning_test:autotuning_test.py:121 Assert nsslapd-threadnumber is back to the autotuned value
Passed suites/config/autotuning_test.py::test_cache_autosize_non_zero[-] 5.16
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:169 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test INFO  tests.suites.config.autotuning_test:autotuning_test.py:170 nsslapd-dbcachesize == b'408620032' INFO  tests.suites.config.autotuning_test:autotuning_test.py:171 nsslapd-cachememsize == b'1409286144' INFO  tests.suites.config.autotuning_test:autotuning_test.py:172 nsslapd-dncachememsize == b'201326592' INFO  tests.suites.config.autotuning_test:autotuning_test.py:173 nsslapd-cache-autosize == b'25' INFO  tests.suites.config.autotuning_test:autotuning_test.py:174 nsslapd-cache-autosize-split == b'25' INFO  tests.suites.config.autotuning_test:autotuning_test.py:180 Delete nsslapd-cache-autosize INFO  tests.suites.config.autotuning_test:autotuning_test.py:190 Delete nsslapd-cache-autosize-split INFO  tests.suites.config.autotuning_test:autotuning_test.py:196 Trying to set nsslapd-cachememsize to 33333333 INFO  tests.suites.config.autotuning_test:autotuning_test.py:199 Trying to set nsslapd-dbcachesize to 33333333 INFO  tests.suites.config.autotuning_test:autotuning_test.py:216 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range. INFO  tests.suites.config.autotuning_test:autotuning_test.py:217 nsslapd-dbcachesize == b'408620032' INFO  tests.suites.config.autotuning_test:autotuning_test.py:218 nsslapd-cachememsize == b'1409286144' INFO  tests.suites.config.autotuning_test:autotuning_test.py:219 nsslapd-dncachememsize == b'201326592' INFO  tests.suites.config.autotuning_test:autotuning_test.py:220 nsslapd-cache-autosize == b'25' INFO  tests.suites.config.autotuning_test:autotuning_test.py:221 nsslapd-cache-autosize-split == b'25'
Passed suites/config/autotuning_test.py::test_cache_autosize_non_zero[-0] 4.80
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:169 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:170 nsslapd-dbcachesize == b'408620032'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:171 nsslapd-cachememsize == b'1409286144'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:172 nsslapd-dncachememsize == b'201326592'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:173 nsslapd-cache-autosize == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:174 nsslapd-cache-autosize-split == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:180 Delete nsslapd-cache-autosize
INFO  tests.suites.config.autotuning_test:autotuning_test.py:187 Set nsslapd-cache-autosize-split to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:196 Trying to set nsslapd-cachememsize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:199 Trying to set nsslapd-dbcachesize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:216 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:217 nsslapd-dbcachesize == b'408620032'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:218 nsslapd-cachememsize == b'1409286144'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:219 nsslapd-dncachememsize == b'201326592'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:220 nsslapd-cache-autosize == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:221 nsslapd-cache-autosize-split == b'0'
Passed suites/config/autotuning_test.py::test_cache_autosize_non_zero[10-400] 4.65
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:169 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:170 nsslapd-dbcachesize == b'408620032'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:171 nsslapd-cachememsize == b'1409286144'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:172 nsslapd-dncachememsize == b'201326592'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:173 nsslapd-cache-autosize == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:174 nsslapd-cache-autosize-split == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:177 Set nsslapd-cache-autosize to 10
INFO  tests.suites.config.autotuning_test:autotuning_test.py:187 Set nsslapd-cache-autosize-split to 40
INFO  tests.suites.config.autotuning_test:autotuning_test.py:196 Trying to set nsslapd-cachememsize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:199 Trying to set nsslapd-dbcachesize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:216 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:217 nsslapd-dbcachesize == b'261516820'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:218 nsslapd-cachememsize == b'469762048'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:219 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:220 nsslapd-cache-autosize == b'10'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:221 nsslapd-cache-autosize-split == b'40'
Passed suites/config/autotuning_test.py::test_cache_autosize_non_zero[-40] 4.75
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:169 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:170 nsslapd-dbcachesize == b'261516820'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:171 nsslapd-cachememsize == b'469762048'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:172 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:173 nsslapd-cache-autosize == b'10'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:174 nsslapd-cache-autosize-split == b'40'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:180 Delete nsslapd-cache-autosize
INFO  tests.suites.config.autotuning_test:autotuning_test.py:187 Set nsslapd-cache-autosize-split to 40
INFO  tests.suites.config.autotuning_test:autotuning_test.py:196 Trying to set nsslapd-cachememsize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:199 Trying to set nsslapd-dbcachesize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:216 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:217 nsslapd-dbcachesize == b'817240064'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:218 nsslapd-cachememsize == b'1140850688'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:219 nsslapd-dncachememsize == b'134217728'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:220 nsslapd-cache-autosize == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:221 nsslapd-cache-autosize-split == b'40'
Passed suites/config/autotuning_test.py::test_cache_autosize_non_zero[10-] 4.72
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:169 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:170 nsslapd-dbcachesize == b'817240064'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:171 nsslapd-cachememsize == b'1140850688'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:172 nsslapd-dncachememsize == b'134217728'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:173 nsslapd-cache-autosize == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:174 nsslapd-cache-autosize-split == b'40'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:177 Set nsslapd-cache-autosize to 10
INFO  tests.suites.config.autotuning_test:autotuning_test.py:190 Delete nsslapd-cache-autosize-split
INFO  tests.suites.config.autotuning_test:autotuning_test.py:196 Trying to set nsslapd-cachememsize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:199 Trying to set nsslapd-dbcachesize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:216 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:217 nsslapd-dbcachesize == b'163448012'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:218 nsslapd-cachememsize == b'603979776'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:219 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:220 nsslapd-cache-autosize == b'10'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:221 nsslapd-cache-autosize-split == b'25'
Passed suites/config/autotuning_test.py::test_cache_autosize_non_zero[10-401] 4.37
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:169 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:170 nsslapd-dbcachesize == b'163448012'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:171 nsslapd-cachememsize == b'603979776'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:172 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:173 nsslapd-cache-autosize == b'10'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:174 nsslapd-cache-autosize-split == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:177 Set nsslapd-cache-autosize to 10
INFO  tests.suites.config.autotuning_test:autotuning_test.py:187 Set nsslapd-cache-autosize-split to 40
INFO  tests.suites.config.autotuning_test:autotuning_test.py:196 Trying to set nsslapd-cachememsize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:199 Trying to set nsslapd-dbcachesize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:216 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:217 nsslapd-dbcachesize == b'261516820'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:218 nsslapd-cachememsize == b'469762048'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:219 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:220 nsslapd-cache-autosize == b'10'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:221 nsslapd-cache-autosize-split == b'40'
Passed suites/config/autotuning_test.py::test_cache_autosize_non_zero[10-0] 4.73
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:169 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:170 nsslapd-dbcachesize == b'261516820'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:171 nsslapd-cachememsize == b'469762048'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:172 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:173 nsslapd-cache-autosize == b'10'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:174 nsslapd-cache-autosize-split == b'40'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:177 Set nsslapd-cache-autosize to 10
INFO  tests.suites.config.autotuning_test:autotuning_test.py:187 Set nsslapd-cache-autosize-split to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:196 Trying to set nsslapd-cachememsize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:199 Trying to set nsslapd-dbcachesize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:216 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:217 nsslapd-dbcachesize == b'163448012'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:218 nsslapd-cachememsize == b'603979776'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:219 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:220 nsslapd-cache-autosize == b'10'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:221 nsslapd-cache-autosize-split == b'0'
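Editor's note: the test_cache_autosize_non_zero cases above all follow one pattern: write the autosizing percentages on the ldbm config entry, attempt manual cache sizes, and verify the autotuner recomputes them after a restart. A hedged sketch of that pattern (inst is an assumed lib389 DirSrv instance, as before):

import ldap

LDBM_CONFIG = 'cn=config,cn=ldbm database,cn=plugins,cn=config'
# Give the autotuner 10% of system memory, 40% of which goes to the DB cache.
inst.modify_s(LDBM_CONFIG, [
    (ldap.MOD_REPLACE, 'nsslapd-cache-autosize', b'10'),
    (ldap.MOD_REPLACE, 'nsslapd-cache-autosize-split', b'40'),
])
inst.restart()
# With autosizing active, manually written nsslapd-dbcachesize and
# nsslapd-cachememsize values are recomputed at startup, which is what the
# before/after value checks in the log assert.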
Passed suites/config/autotuning_test.py::test_cache_autosize_basic_sane[0] 9.51
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:273 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:274 nsslapd-dbcachesize == b'163448012'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:275 nsslapd-cachememsize == b'603979776'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:276 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:277 nsslapd-cache-autosize-split == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:280 Set nsslapd-cache-autosize-split to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:289 Set nsslapd-dbcachesize to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:291 Set nsslapd-cachememsize to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:307 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:308 nsslapd-dbcachesize == b'408620032'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:309 nsslapd-cachememsize == b'1409286144'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:310 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:311 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:312 nsslapd-cache-autosize-split == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:273 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:274 nsslapd-dbcachesize == b'408620032'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:275 nsslapd-cachememsize == b'1409286144'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:276 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:277 nsslapd-cache-autosize-split == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:280 Set nsslapd-cache-autosize-split to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:289 Set nsslapd-dbcachesize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:291 Set nsslapd-cachememsize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:307 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:308 nsslapd-dbcachesize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:309 nsslapd-cachememsize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:310 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:311 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:312 nsslapd-cache-autosize-split == b'0'
Passed suites/config/autotuning_test.py::test_cache_autosize_basic_sane[] 9.25
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:273 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:274 nsslapd-dbcachesize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:275 nsslapd-cachememsize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:276 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:277 nsslapd-cache-autosize-split == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:283 Delete nsslapd-cache-autosize-split
INFO  tests.suites.config.autotuning_test:autotuning_test.py:289 Set nsslapd-dbcachesize to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:291 Set nsslapd-cachememsize to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:307 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:308 nsslapd-dbcachesize == b'408620032'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:309 nsslapd-cachememsize == b'1409286144'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:310 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:311 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:312 nsslapd-cache-autosize-split == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:273 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:274 nsslapd-dbcachesize == b'408620032'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:275 nsslapd-cachememsize == b'1409286144'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:276 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:277 nsslapd-cache-autosize-split == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:283 Delete nsslapd-cache-autosize-split
INFO  tests.suites.config.autotuning_test:autotuning_test.py:289 Set nsslapd-dbcachesize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:291 Set nsslapd-cachememsize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:307 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:308 nsslapd-dbcachesize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:309 nsslapd-cachememsize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:310 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:311 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:312 nsslapd-cache-autosize-split == b'25'
Passed suites/config/autotuning_test.py::test_cache_autosize_basic_sane[40] 9.32
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:273 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:274 nsslapd-dbcachesize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:275 nsslapd-cachememsize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:276 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:277 nsslapd-cache-autosize-split == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:280 Set nsslapd-cache-autosize-split to 40
INFO  tests.suites.config.autotuning_test:autotuning_test.py:289 Set nsslapd-dbcachesize to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:291 Set nsslapd-cachememsize to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:307 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:308 nsslapd-dbcachesize == b'817240064'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:309 nsslapd-cachememsize == b'1140850688'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:310 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:311 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:312 nsslapd-cache-autosize-split == b'40'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:273 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:274 nsslapd-dbcachesize == b'817240064'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:275 nsslapd-cachememsize == b'1140850688'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:276 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:277 nsslapd-cache-autosize-split == b'40'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:280 Set nsslapd-cache-autosize-split to 40
INFO  tests.suites.config.autotuning_test:autotuning_test.py:289 Set nsslapd-dbcachesize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:291 Set nsslapd-cachememsize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:307 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:308 nsslapd-dbcachesize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:309 nsslapd-cachememsize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:310 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:311 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:312 nsslapd-cache-autosize-split == b'40'
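Editor's note: the test_cache_autosize_basic_sane cases show the complementary behaviour: with nsslapd-cache-autosize at 0, a cache size written as 0 is re-tuned once at startup, while a concrete value such as 33333333 survives the restart. A sketch under the same assumptions as above:

import ldap

LDBM_CONFIG = 'cn=config,cn=ldbm database,cn=plugins,cn=config'
USERROOT = 'cn=userRoot,cn=ldbm database,cn=plugins,cn=config'
# Disable autosizing, then pin explicit cache sizes.
inst.modify_s(LDBM_CONFIG, [(ldap.MOD_REPLACE, 'nsslapd-cache-autosize', b'0')])
inst.modify_s(LDBM_CONFIG, [(ldap.MOD_REPLACE, 'nsslapd-dbcachesize', b'33333333')])
inst.modify_s(USERROOT, [(ldap.MOD_REPLACE, 'nsslapd-cachememsize', b'33333333')])
inst.restart()
# Both explicit sizes should read back unchanged, as the second pass of the
# log above records.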
Passed suites/config/autotuning_test.py::test_cache_autosize_invalid_values[-2] 0.18
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:346 Set nsslapd-cache-autosize-split to -2
INFO  tests.suites.config.autotuning_test:autotuning_test.py:352 Set nsslapd-cache-autosize to -2
Passed suites/config/autotuning_test.py::test_cache_autosize_invalid_values[102] 0.29
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:346 Set nsslapd-cache-autosize-split to 102
INFO  tests.suites.config.autotuning_test:autotuning_test.py:352 Set nsslapd-cache-autosize to 102
Passed suites/config/autotuning_test.py::test_cache_autosize_invalid_values[invalid] 0.04
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:346 Set nsslapd-cache-autosize-split to invalid
INFO  tests.suites.config.autotuning_test:autotuning_test.py:352 Set nsslapd-cache-autosize to invalid
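Editor's note: the invalid-value cases above only assert that the write is refused. The exact exception class depends on the attribute and server version, so a sketch should hedge with python-ldap's generic base class:

import ldap
import pytest

# Hypothetical assertion mirroring the cases above: the server is expected
# to reject out-of-range or non-numeric autosize values.
with pytest.raises(ldap.LDAPError):
    inst.modify_s('cn=config,cn=ldbm database,cn=plugins,cn=config',
                  [(ldap.MOD_REPLACE, 'nsslapd-cache-autosize', b'102')])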
Passed suites/config/config_test.py::test_defaultnamingcontext 1.83
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.config_test:config_test.py:220 Check the attribute nsslapd-defaultnamingcontext is present in cn=config
INFO  tests.suites.config.config_test:config_test.py:223 Delete nsslapd-defaultnamingcontext attribute
INFO  tests.suites.config.config_test:config_test.py:230 Modify nsslapd-defaultnamingcontext with a new suffix
INFO  tests.suites.config.config_test:config_test.py:233 Add a new invalid value at runtime to nsslapd-defaultnamingcontext
INFO  tests.suites.config.config_test:config_test.py:237 Modify nsslapd-defaultnamingcontext with a blank value
INFO  tests.suites.config.config_test:config_test.py:240 Add a new suffix when nsslapd-defaultnamingcontext is empty
INFO  tests.suites.config.config_test:config_test.py:244 Check that the value of nsslapd-defaultnamingcontext automatically picks up the new suffix
INFO  tests.suites.config.config_test:config_test.py:247 Add a new suffix when nsslapd-defaultnamingcontext is not empty
INFO  tests.suites.config.config_test:config_test.py:251 Check that the value of nsslapd-defaultnamingcontext has not changed
INFO  tests.suites.config.config_test:config_test.py:254 Remove the newly added suffix and check that the value of the attribute has not changed
INFO  tests.suites.config.config_test:config_test.py:258 Remove all the suffixes at the end
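Editor's note: a condensed sketch of the nsslapd-defaultnamingcontext behaviour exercised above (inst as before; the suffix value is illustrative):

import ldap

# Read the current default naming context from cn=config.
default_nc = inst.config.get_attr_val_utf8('nsslapd-defaultnamingcontext')
# Per the test steps above, a value that is not a configured suffix is
# expected to be refused at runtime; the exception class is hedged here.
try:
    inst.config.set('nsslapd-defaultnamingcontext', 'dc=not,dc=a,dc=suffix')
except ldap.LDAPError:
    pass  # expected for an invalid suffix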
Passed suites/config/config_test.py::test_allow_add_delete_config_attributes 5.19
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.config_test:config_test.py:308 Add a new valid attribute at runtime to cn=config
INFO  tests.suites.config.config_test:config_test.py:312 Delete nsslapd-listenhost to restore the default value
INFO  tests.suites.config.config_test:config_test.py:317 Add new invalid attribute at runtime to cn=config
INFO  tests.suites.config.config_test:config_test.py:321 Make sure the invalid attribute is not added
Passed suites/config/config_test.py::test_ignore_virtual_attrs 0.35
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.config_test:config_test.py:354 Check the attribute nsslapd-ignore-virtual-attrs is present in cn=config
INFO  tests.suites.config.config_test:config_test.py:357 Check that the default value of nsslapd-ignore-virtual-attrs is OFF
INFO  tests.suites.config.config_test:config_test.py:360 Set the valid values, i.e. on/ON and off/OFF, for nsslapd-ignore-virtual-attrs
INFO  tests.suites.config.config_test:config_test.py:365 Set an invalid value for nsslapd-ignore-virtual-attrs
INFO  tests.suites.config.config_test:config_test.py:376 Add cosPointer, cosTemplate and a test entry to the default suffix, where the virtual attribute is postal code
INFO  tests.suites.config.config_test:config_test.py:389 Check that the virtual attribute (postal code) is shown in the test entry while nsslapd-ignore-virtual-attrs is off
INFO  tests.suites.config.config_test:config_test.py:392 Set nsslapd-ignore-virtual-attrs=on
INFO  tests.suites.config.config_test:config_test.py:395 Check that the virtual attribute (postal code) is not shown while nsslapd-ignore-virtual-attrs is on
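Editor's note: a minimal sketch of the toggle being tested, assuming inst as before; the entry DN and attribute list mirror the scenario above and are illustrative, and the CoS setup that generates postalCode is omitted:

import ldap

inst.config.set('nsslapd-ignore-virtual-attrs', 'off')
dn, attrs = inst.search_s('uid=test_user,dc=example,dc=com', ldap.SCOPE_BASE,
                          '(objectClass=*)', ['postalcode'])[0]
# With the flag off, the CoS-generated postalCode is visible; after setting
# the flag to 'on', the same search should come back without it.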
Passed suites/config/config_test.py::test_ndn_cache_enabled 8.41
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.config_test:config_test.py:423 Check the attribute nsslapd-ndn-cache-enabled is present in cn=config
INFO  tests.suites.config.config_test:config_test.py:426 Check the attribute nsslapd-ndn-cache-enabled has the default value set as ON
INFO  tests.suites.config.config_test:config_test.py:429 Check the attribute nsslapd-ndn-cache-max-size is present in cn=config
INFO  tests.suites.config.config_test:config_test.py:435 Ticket#49593 : NDN cache stats should be under the global stats - Implemented in 1.4
INFO  tests.suites.config.config_test:config_test.py:436 Fetch the monitor value according to the ds version
INFO  tests.suites.config.config_test:config_test.py:442 Check the backend monitor output for Normalized DN cache statistics, while nsslapd-ndn-cache-enabled is off
INFO  tests.suites.config.config_test:config_test.py:448 Check the backend monitor output for Normalized DN cache statistics, while nsslapd-ndn-cache-enabled is on
INFO  tests.suites.config.config_test:config_test.py:454 Set invalid value for nsslapd-ndn-cache-enabled
INFO  tests.suites.config.config_test:config_test.py:458 Set invalid value for nsslapd-ndn-cache-max-size
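Editor's note: on 1.4.x the normalized DN cache counters are published on the ldbm database monitor entry. A hedged sketch of the monitor check above (inst as before; the attribute-name prefix is as exposed by the server but treated here as an assumption):

import ldap

monitor = inst.search_s('cn=monitor,cn=ldbm database,cn=plugins,cn=config',
                        ldap.SCOPE_BASE)[0][1]
ndn_attrs = [a for a in monitor if a.lower().startswith('normalizeddncache')]
assert ndn_attrs  # NDN cache stats present while nsslapd-ndn-cache-enabled is on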
Passed suites/config/config_test.py::test_require_index 0.67
No log output captured.
Passed suites/config/config_test.py::test_require_internal_index 6.40
No log output captured.
Passed suites/config/regression_test.py::test_maxbersize_repl 7.19
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.regression_test:regression_test.py:100 Set nsslapd-errorlog-maxlogsize before nsslapd-errorlog-logmaxdiskspace
INFO  tests.suites.config.regression_test:regression_test.py:104 Assert no init_dse_file errors in the error log
INFO  tests.suites.config.regression_test:regression_test.py:108 Set nsslapd-errorlog-maxlogsize after nsslapd-errorlog-logmaxdiskspace
INFO  tests.suites.config.regression_test:regression_test.py:112 Assert no init_dse_file errors in the error log
Passed suites/config/removed_config_49298_test.py::test_restore_config 3.48
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.removed_config_49298_test:removed_config_49298_test.py:43 /etc/dirsrv/slapd-standalone1
Passed suites/config/removed_config_49298_test.py::test_removed_config 2.40
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.removed_config_49298_test:removed_config_49298_test.py:72 /etc/dirsrv/slapd-standalone1
Passed suites/cos/cos_test.py::test_positive 0.29
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/cos/indirect_cos_test.py::test_indirect_cos 1.54
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
INFO  tests.suites.cos.indirect_cos_test:indirect_cos_test.py:107 Add custom schema...
INFO  tests.suites.cos.indirect_cos_test:indirect_cos_test.py:120 Add test user...
INFO  tests.suites.cos.indirect_cos_test:indirect_cos_test.py:137 Setup indirect COS...
-------------------------------Captured log call--------------------------------
INFO  tests.suites.cos.indirect_cos_test:indirect_cos_test.py:157 Checking user...
INFO  tests.suites.cos.indirect_cos_test:indirect_cos_test.py:56 Create password policy for subtree ou=people,dc=example,dc=com
INFO  tests.suites.cos.indirect_cos_test:indirect_cos_test.py:164 Checking user...
Passed suites/disk_monitoring/disk_monitoring_test.py::test_verify_operation_when_disk_monitoring_is_off 4.58
-----------------------------Captured stdout setup------------------------------
Relabeled /var/log/dirsrv/slapd-standalone1 from unconfined_u:object_r:user_tmp_t:s0 to system_u:object_r:dirsrv_var_log_t:s0
-----------------------------Captured stderr setup------------------------------
chown: cannot access '/var/log/dirsrv/slapd-standalone1/*': No such file or directory
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
25+0 records in
25+0 records out
26214400 bytes (26 MB, 25 MiB) copied, 0.0166991 s, 1.6 GB/s
dd: error writing '/var/log/dirsrv/slapd-standalone1/foo1': No space left on device
10+0 records in
9+0 records out
10465280 bytes (10 MB, 10 MiB) copied, 0.00775763 s, 1.3 GB/s
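Editor's note: the stderr above comes from filling the log partition with dd until it hits ENOSPC, which is how the disk-monitoring suite provokes a low-disk condition. A Python sketch of the same trick (path and chunk size are illustrative):

import errno

def fill_disk(path, chunk=1024 * 1024):
    """Write zeros to `path` until the filesystem reports ENOSPC."""
    buf = b'\0' * chunk
    with open(path, 'wb') as f:
        try:
            while True:
                f.write(buf)
                f.flush()
        except OSError as e:
            if e.errno != errno.ENOSPC:
                raise  # anything but "No space left on device" is a real error

# e.g. fill_disk('/var/log/dirsrv/slapd-standalone1/foo1')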
Passed suites/disk_monitoring/disk_monitoring_test.py::test_free_up_the_disk_space_and_change_ds_config 4.37
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_verify_operation_with_nsslapd_disk_monitoring_logging_critical_off 34.67
------------------------------Captured stderr call------------------------------
10+0 records in
10+0 records out
10485760 bytes (10 MB, 10 MiB) copied, 0.0060443 s, 1.7 GB/s
Passed suites/disk_monitoring/disk_monitoring_test.py::test_operation_with_nsslapd_disk_monitoring_logging_critical_on_below_half_of_the_threshold 25.44
------------------------------Captured stderr call------------------------------
31+0 records in
31+0 records out
32505856 bytes (33 MB, 31 MiB) copied, 0.0341846 s, 951 MB/s
Passed suites/disk_monitoring/disk_monitoring_test.py::test_setting_nsslapd_disk_monitoring_logging_critical_to_off 3.41
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_operation_with_nsslapd_disk_monitoring_logging_critical_off 64.97
------------------------------Captured stderr call------------------------------
10+0 records in
10+0 records out
10485760 bytes (10 MB, 10 MiB) copied, 0.0107861 s, 972 MB/s
Passed suites/disk_monitoring/disk_monitoring_test.py::test_operation_with_nsslapd_disk_monitoring_logging_critical_off_below_half_of_the_threshold 155.75
------------------------------Captured stderr call------------------------------
30+0 records in
30+0 records out
31457280 bytes (31 MB, 30 MiB) copied, 0.0159844 s, 2.0 GB/s
Passed suites/disk_monitoring/disk_monitoring_test.py::test_go_straight_below_half_of_the_threshold 107.60
------------------------------Captured stderr call------------------------------
31+0 records in
31+0 records out
32505856 bytes (33 MB, 31 MiB) copied, 0.0342783 s, 948 MB/s
Passed suites/disk_monitoring/disk_monitoring_test.py::test_readonly_on_threshold 28.15
------------------------------Captured stderr call------------------------------
10+0 records in
10+0 records out
10485760 bytes (10 MB, 10 MiB) copied, 0.00869365 s, 1.2 GB/s
Passed suites/disk_monitoring/disk_monitoring_test.py::test_readonly_on_threshold_below_half_of_the_threshold 49.81
------------------------------Captured stderr call------------------------------
31+0 records in
31+0 records out
32505856 bytes (33 MB, 31 MiB) copied, 0.0305455 s, 1.1 GB/s
Passed suites/disk_monitoring/disk_monitoring_test.py::test_below_half_of_the_threshold_not_starting_after_shutdown 112.19
------------------------------Captured stderr call------------------------------
31+0 records in
31+0 records out
32505856 bytes (33 MB, 31 MiB) copied, 0.0148017 s, 2.2 GB/s
-------------------------------Captured log call--------------------------------
INFO  lib389:disk_monitoring_test.py:582 Instance start up has failed as expected
Passed suites/disk_monitoring/disk_monitoring_test.py::test_go_straight_below_4kb 18.15
------------------------------Captured stderr call------------------------------
25+0 records in
25+0 records out
26214400 bytes (26 MB, 25 MiB) copied, 0.031183 s, 841 MB/s
dd: error writing '/var/log/dirsrv/slapd-standalone1/foo1': No space left on device
10+0 records in
9+0 records out
10321920 bytes (10 MB, 9.8 MiB) copied, 0.00676857 s, 1.5 GB/s
Passed suites/disk_monitoring/disk_monitoring_test.py::test_threshold_to_overflow_value 0.03
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_threshold_is_reached_to_half 14.41
------------------------------Captured stderr call------------------------------
10+0 records in
10+0 records out
10485760 bytes (10 MB, 10 MiB) copied, 0.00807693 s, 1.3 GB/s
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-threshold--2] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-threshold-9223372036854775808] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-threshold-2047] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-threshold-0] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-threshold--1294967296] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-threshold-invalid] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-invalid] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-1] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-grace-period-00] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-grace-period-525 948] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-grace-period--10] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-logging-critical-oninvalid] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-grace-period--11] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-grace-period-01] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_valid_operations_are_permitted 3.88
No log output captured.
Passed suites/disk_monitoring/disk_space_test.py::test_basic 0.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:disk_space_test.py:37 Check that "partition", "size", "used", "available", "use%" words are present in the string
INFO  lib389:disk_space_test.py:41 Check that the sizes are numbers
Passed suites/ds_logs/ds_logs_test.py::test_check_default 0.17
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
DEBUG  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:233 on
Passed suites/ds_logs/ds_logs_test.py::test_plugin_set_invalid 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:252 test_plugin_set_invalid - Expect to fail with junk value
Passed suites/ds_logs/ds_logs_test.py::test_log_plugin_on 4.33
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:279 Bug 1273549 - Check access logs for millisecond, when attribute is ON
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:280 perform any ldap operation, which will trigger the logs
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:36 Adding 10 users
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:284 Restart the server to flush the logs
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:287 parse the access logs
Passed suites/ds_logs/ds_logs_test.py::test_log_plugin_off 12.52
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:319 Bug 1273549 - Check access logs for missing millisecond, when attribute is OFF
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:321 test_log_plugin_off - set the configuration attribute to OFF
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:324 Restart the server to flush the logs
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:327 test_log_plugin_off - delete the previous access logs
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:36 Adding 10 users
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:334 Restart the server to flush the logs
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:337 check access log that microseconds are not present
Passed suites/ds_logs/ds_logs_test.py::test_internal_log_server_level_0 4.67
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:200 Disable access log buffering
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:365 Set nsslapd-plugin-logging to on
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:368 Configure access log level to 0
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:372 Restart the server to flush the logs
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:376 Check if access log does not contain internal log of MOD operation
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:382 Check if the other internal operations are not present
Passed suites/ds_logs/ds_logs_test.py::test_internal_log_server_level_4 6.20
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:200 Disable access log buffering
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:414 Set nsslapd-plugin-logging to on
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:417 Configure access log level to 4
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:421 Restart the server to flush the logs
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:426 Check if access log contains internal MOD operation in correct format
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:432 Check if the other internal operations have the correct format
Passed suites/ds_logs/ds_logs_test.py::test_internal_log_level_260 7.07
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:102 Enable automember plugin
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:106 Enable Referential Integrity plugin
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:110 Set nsslapd-plugin-logging to on
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:113 Restart the server
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:120 Configure access log level to 260
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:86 Renaming user
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:89 Delete the user
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:92 Delete automember entry, org. unit and group for the next test
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:200 Disable access log buffering
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:472 Restart the server to flush the logs
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:476 Check the access logs for ADD operation of the user
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:492 Check the access logs for MOD operation of the user
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:506 Check the access logs for DEL operation of the user
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:518 Check if the other internal operations have the correct format
Passed suites/ds_logs/ds_logs_test.py::test_internal_log_level_131076 6.16
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:120 Configure access log level to 131076
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:86 Renaming user
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:89 Delete the user
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:92 Delete automember entry, org. unit and group for the next test
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:200 Disable access log buffering
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:557 Restart the server to flush the logs
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:561 Check the access logs for ADD operation of the user
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:577 Check the access logs for MOD operation of the user
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:591 Check the access logs for DEL operation of the user
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:603 Check if the other internal operations have the correct format
Passed suites/ds_logs/ds_logs_test.py::test_internal_log_level_516 6.99
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:120 Configure access log level to 516
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:86 Renaming user
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:89 Delete the user
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:92 Delete automember entry, org. unit and group for the next test
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:200 Disable access log buffering
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:642 Restart the server to flush the logs
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:646 Check the access logs for ADD operation of the user
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:665 Check the access logs for MOD operation of the user
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:682 Check the access logs for DEL operation of the user
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:697 Check if the other internal operations have the correct format
Passed suites/ds_logs/ds_logs_test.py::test_access_log_truncated_search_message 4.83
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:724 Make a search
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:727 Restart the server to flush the logs
Passed suites/ds_logs/ds_logs_test.py::test_etime_at_border_of_second 22.39
No log output captured.
Passed suites/ds_logs/ds_logs_test.py::test_etime_order_of_magnitude 2.31
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:200 Disable access log buffering
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:814 add_users
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:36 Adding 30 users
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:817 search users
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:820 parse the access logs to get the SRCH string
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:828 get the operation start time from the SRCH string
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:832 get the OP number from the SRCH string
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:836 get the RESULT string matching the SRCH OP number
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:844 get the operation end time from the RESULT string
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:848 get the logged etime for the operation from the RESULT string
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:852 Calculate the ratio between logged etime for the operation and elapsed time from its start time to its end time - should be around 1
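Editor's note: a sketch of the etime sanity check walked through above. It pulls the logged etime out of a RESULT line; the sample line is illustrative of the 1.4.x access-log format, and the regex is an assumption, not the suite's own parser:

import re

line = ('[20/Oct/2020:21:00:00.123456789 -0400] conn=1 op=2 RESULT err=0 '
        'tag=101 nentries=30 etime=0.001234')
etime = float(re.search(r'etime=(\d+\.\d+)', line).group(1))
# The test compares this logged etime with the wall-clock delta between the
# SRCH and RESULT timestamps for the same op= number; the ratio should be
# around 1, i.e. the same order of magnitude.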
Passed suites/ds_logs/ds_logs_test.py::test_log_base_dn_when_invalid_attr_request 3.66
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:200 Disable access log buffering
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:884 delete the previous access logs to get a fresh new one
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:887 Search the default suffix, with invalid '"" ""' attribute request
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:888 A Protocol error exception should be raised, see https://github.com/389ds/389-ds-base/issues/3028
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:894 Check the access logs for correct messages
Passed suites/ds_logs/ds_logs_test.py::test_audit_log_rotate_and_check_string 12.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:940 Doing modifications to rotate audit log
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:945 Doing one more modification just in case
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:952 Check that DS string is present on first line
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:956 Check that DS string is present only once
Passed suites/ds_logs/regression_test.py::test_default_loglevel_stripped[24576] 0.25
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/ds_logs/regression_test.py::test_default_loglevel_stripped[16512] 0.03
No log output captured.
Passed suites/ds_logs/regression_test.py::test_default_loglevel_stripped[16385] 0.97
No log output captured.
Passed suites/ds_logs/regression_test.py::test_dse_config_loglevel_error 23.26
No log output captured.
Passed suites/ds_tools/logpipe_test.py::test_user_permissions 0.06
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
INFO  tests.suites.ds_tools.logpipe_test:logpipe_test.py:32 Add system test user - dirsrv_testuser
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_tools.logpipe_test:logpipe_test.py:68 Try to create a logpipe in the log directory with "-u" option specifying the user
Passed suites/export/export_test.py::test_dbtasks_db2ldif_with_non_accessible_ldif_file_path 4.65
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
ldiffile: /tmp/nonexistent/export.ldif
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:export_test.py:63 Stopping the instance...
INFO  lib389.utils:export_test.py:66 Performing an offline export to a non accessible ldif file path - should fail properly
CRITICAL LogCapture:dbtasks.py:40 db2ldif failed
INFO  lib389.utils:export_test.py:33 checking output msg
INFO  lib389.utils:export_test.py:38 Clear the log
INFO  lib389.utils:export_test.py:70 parsing the errors log to search for the error reported
INFO  lib389.utils:export_test.py:79 Restarting the instance...
Passed suites/export/export_test.py::test_db2ldif_cli_with_non_accessible_ldif_file_path 3.80
------------------------------Captured stdout call------------------------------
Exported ldif file: /tmp/nonexistent/export.ldif
------------------------------Captured stderr call------------------------------
[20/Oct/2020:19:48:37.529916093 -0400] - INFO - slapd_exemode_db2ldif - db2ldif - Backend Instance(s):
[20/Oct/2020:19:48:37.538045581 -0400] - INFO - slapd_exemode_db2ldif - db2ldif - userRoot
ldiffile: /tmp/nonexistent/export.ldif
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:export_test.py:104 Stopping the instance...
INFO  lib389.utils:export_test.py:107 Performing an offline export to a non accessible ldif file path - should fail properly
INFO  lib389.utils:export_test.py:115 db2ldif failed properly: error (255)
INFO  lib389.utils:export_test.py:118 parsing the errors log to search for the error reported
INFO  lib389.utils:export_test.py:121 error string : '[20/Oct/2020:19:48:33.734698513 -0400] - ERR - bdb_db2ldif - db2ldif: userRoot: can\'t open /tmp/nonexistent/export.ldif: 2 (No such file or directory) while running as user "dirsrv"\n', '[20/Oct/2020:19:48:37.549905534 -0400] - ERR - bdb_db2ldif - db2ldif: userRoot: can\'t open /tmp/nonexistent/export.ldif: 2 (No such file or directory) while running as user "dirsrv"\n'
INFO  lib389.utils:export_test.py:123 Restarting the instance...
Passed suites/filter/basic_filter_test.py::test_search_attr 0.59
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/bitw_filter_test.py::test_bitwise_plugin_status 0.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/bitw_filter_test.py::test_search_disabled_accounts 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_plugin_can_be_disabled 5.30
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_plugin_is_disabled 0.00
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_enabling_works_fine 4.68
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=513))-1] 0.00
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=16777216))-1] 0.00
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=8388608))-1] 0.17
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.804:=5))-3] 0.17
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.804:=8))-3] 0.00
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.804:=7))-5] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testerperson) (testUserAccountControl:1.2.840.113556.1.4.804:=7))-0] 0.14
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (&(testUserAccountControl:1.2.840.113556.1.4.803:=98536)(testUserAccountControl:1.2.840.113556.1.4.803:=912)))-0] 0.16
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (&(testUserAccountControl:1.2.840.113556.1.4.804:=87)(testUserAccountControl:1.2.840.113556.1.4.804:=91)))-8] 0.17
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (&(testUserAccountControl:1.2.840.113556.1.4.803:=89536)(testUserAccountControl:1.2.840.113556.1.4.804:=79)))-1] 0.00
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (|(testUserAccountControl:1.2.840.113556.1.4.803:=89536)(testUserAccountControl:1.2.840.113556.1.4.804:=79)))-8] 0.00
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (|(testUserAccountControl:1.2.840.113556.1.4.803:=89)(testUserAccountControl:1.2.840.113556.1.4.803:=536)))-0] 0.00
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=x))-13] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=&\*#$%))-13] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=-65536))-0] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=-1))-0] 0.00
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=-))-13] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=))-13] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=\*))-13] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.804:=\*))-0] 0.00
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=6552))-0] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson\))(testUserAccountControl:1.2.840.113556.1.4.804:=6552))-0] 0.00
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=65536))-5] 0.00
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_5_entries 0.21
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_5_entries1 0.22
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_5_entries3 0.02
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_5_entries4 0.02
No log output captured.
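The bitw_filter cases above exercise the Microsoft-style bitwise matching rules: OID 1.2.840.113556.1.4.803 is bitwise AND (every asserted bit must be set in the attribute value) and 1.2.840.113556.1.4.804 is bitwise OR (any asserted bit may be set). The trailing "-N" in each test id is the number of entries the filter is expected to return. A minimal sketch of such a search with python-ldap, assuming a hypothetical port, credentials, and testperson entries (the suite itself drives this through lib389 fixtures):

import ldap

# Hypothetical connection details for illustration only.
conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('cn=Directory Manager', 'password')

# AND rule: matches only if all bits of 513 are set in the attribute value.
and_filter = '(&(objectclass=testperson)(testUserAccountControl:1.2.840.113556.1.4.803:=513))'
# OR rule: matches if any bit of 5 is set in the attribute value.
or_filter = '(&(objectclass=testperson)(testUserAccountControl:1.2.840.113556.1.4.804:=5))'

for filt in (and_filter, or_filter):
    entries = conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE, filt, ['cn'])
    print(filt, '->', len(entries), 'entries')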
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid1)(sn=last1)(givenname=first1))-1] 0.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(uid=uid1)(sn=last1)(givenname=first1))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid1)(&(sn=last1)(givenname=first1)))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(uid=uid1)(&(sn=last1)(givenname=first1)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid1)(&(&(sn=last1))(&(givenname=first1))))-1] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(uid=uid1)(&(&(sn=last1))(&(givenname=first1))))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=*)(sn=last3)(givenname=*))-1] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(uid=*)(sn=last3)(givenname=*))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=*)(&(sn=last3)(givenname=*)))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(uid=*)(&(sn=last3)(givenname=*)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid5)(&(&(sn=*))(&(givenname=*))))-1] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(uid=uid5)(&(&(sn=*))(&(givenname=*))))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(objectclass=*)(uid=*)(sn=last*))-5] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(objectclass=*)(uid=*)(sn=last*))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(objectclass=*)(uid=*)(sn=last1))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(objectclass=*)(uid=*)(sn=last1))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(uid=uid1)(sn=last1)(givenname=first1))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(|(uid=uid1)(sn=last1)(givenname=first1))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(uid=uid1)(|(sn=last1)(givenname=first1)))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(|(uid=uid1)(|(sn=last1)(givenname=first1)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(uid=uid1)(|(|(sn=last1))(|(givenname=first1))))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(|(uid=uid1)(|(|(sn=last1))(|(givenname=first1))))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(objectclass=*)(sn=last1)(|(givenname=first1)))-14] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(|(objectclass=*)(sn=last1)(|(givenname=first1)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(&(objectclass=*)(sn=last1))(|(givenname=first1)))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(|(&(objectclass=*)(sn=last1))(|(givenname=first1)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(&(objectclass=*)(sn=last))(|(givenname=first1)))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(|(&(objectclass=*)(sn=last))(|(givenname=first1)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid1)(!(cn=NULL)))-1] 0.17
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(uid=uid1)(!(cn=NULL)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(!(cn=NULL))(uid=uid1))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(!(cn=NULL))(uid=uid1))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=*)(&(!(uid=1))(!(givenname=first1))))-4] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(uid=*)(&(!(uid=1))(!(givenname=first1))))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(|(uid=uid1)(uid=NULL))(sn=last1))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(|(uid=uid1)(uid=NULL))(sn=last1))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(|(uid=uid1)(uid=NULL))(!(sn=NULL)))-1] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(|(uid=uid1)(uid=NULL))(!(sn=NULL)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(|(uid=uid1)(sn=last2))(givenname=first1))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(|(uid=uid1)(sn=last2))(givenname=first1))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(&(uid=uid1)(!(uid=NULL)))(sn=last2))-2] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(|(&(uid=uid1)(!(uid=NULL)))(sn=last2))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(&(uid=uid1)(uid=NULL))(sn=last2))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(|(&(uid=uid1)(uid=NULL))(sn=last2))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid5)(sn=*)(cn=*)(givenname=*)(uid=u*)(sn=la*)(cn=full*)(givenname=f*)(uid>=u)(!(givenname=NULL)))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(uid=uid5)(sn=*)(cn=*)(givenname=*)(uid=u*)(sn=la*)(cn=full*)(givenname=f*)(uid>=u)(!(givenname=NULL)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(&(objectclass=*)(sn=last))(&(givenname=first1)))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(|(&(objectclass=*)(sn=last))(&(givenname=first1)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid1)(sn=last1)(givenname=NULL))-0] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(uid=uid1)(sn=last1)(givenname=NULL))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid1)(&(sn=last1)(givenname=NULL)))-0] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(uid=uid1)(&(sn=last1)(givenname=NULL)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid1)(&(&(sn=last1))(&(givenname=NULL))))-0] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(uid=uid1)(&(&(sn=last1))(&(givenname=NULL))))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid1)(&(&(sn=last1))(&(givenname=NULL)(sn=*)))(|(sn=NULL)))-0] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(uid=uid1)(&(&(sn=last1))(&(givenname=NULL)(sn=*)))(|(sn=NULL)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid1)(&(&(sn=last*))(&(givenname=first*)))(&(sn=NULL)))-0] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(uid=uid1)(&(&(sn=last*))(&(givenname=first*)))(&(sn=NULL)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(uid=NULL)(sn=NULL)(givenname=NULL))-0] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(|(uid=NULL)(sn=NULL)(givenname=NULL))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(uid=NULL)(|(sn=NULL)(givenname=NULL)))-0] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(|(uid=NULL)(|(sn=NULL)(givenname=NULL)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(uid=NULL)(|(|(sn=NULL))(|(givenname=NULL))))-0] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(|(uid=NULL)(|(|(sn=NULL))(|(givenname=NULL))))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*))-5] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(|(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*))"...
Passed suites/filter/complex_filters_test.py::test_filters[(uid>=uid3)-3] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(uid>=uid3)"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=*)(uid>=uid3))-3] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(uid=*)(uid>=uid3))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(uid>=uid3)(uid<=uid5))-5] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(|(uid>=uid3)(uid<=uid5))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid>=uid3)(uid<=uid5))-3] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(&(uid>=uid3)(uid<=uid5))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(&(uid>=uid3)(uid<=uid5))(uid=*))-5] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:122 Testing filter "(|(&(uid>=uid3)(uid<=uid5))(uid=*))"...
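The complex_filters cases pair each nested boolean filter with its expected entry count (again the trailing "-N" in the test id). A condensed sketch of the same style of check, assuming the suite's uid1..uid5 entries exist and using python-ldap with hypothetical connection details; the expected counts below are taken from the test ids above:

import ldap

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('cn=Directory Manager', 'password')

cases = [
    ('(&(uid=uid1)(sn=last1)(givenname=first1))', 1),     # flat AND
    ('(|(uid=uid1)(|(sn=last1)(givenname=first1)))', 1),  # nested OR
    ('(&(uid=*)(&(!(uid=1))(!(givenname=first1))))', 4),  # AND over NOTs
    ('(&(uid>=uid3)(uid<=uid5))', 3),                     # range conjunction
]
for filt, expected in cases:
    entries = conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE, filt)
    assert len(entries) == expected, (filt, len(entries))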
Passed suites/filter/filter_cert_test.py::test_positive 21.98
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index0] 0.02
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index2] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index3] 0.02
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index4] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index5] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index6] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index7] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index8] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index9] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index10] 0.02
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index11] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index12] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index13] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index14] 0.02
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index15] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index16] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index17] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod0] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod1] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod2] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod3] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod4] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod5] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod6] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod7] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod8] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod9] 0.17
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod10] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod11] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod12] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod13] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod14] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod15] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod16] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod17] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode0] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode1] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode2] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode3] 0.06
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode4] 0.05
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode5] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode6] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode7] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode8] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode9] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode10] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode11] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode12] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode13] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode14] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode15] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode16] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode17] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode0] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode1] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode2] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode3] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode4] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode5] 0.27
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode6] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode7] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode8] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode9] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode10] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode11] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode12] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode13] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode14] 0.06
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode15] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode16] 0.26
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode17] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_search_positive_negative 0.16
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrbitStringMatch:bitStringMatch:='0001'B)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactIA5Match:caseExactIA5Match:=Sprain)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactMatch:caseExactMatch:=ÇélIné Ändrè)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactMatch:caseExactOrderingMatch:=ÇélIné Ändrè)-5] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrgeneralizedTimeMatch:generalizedTimeMatch:=20100218171300Z)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrgeneralizedTimeMatch:generalizedTimeOrderingMatch:=20100218171300Z)-6] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrbooleanMatch:booleanMatch:=TRUE)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreIA5Match:caseIgnoreIA5Match:=sprain1)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreMatch:caseIgnoreMatch:=ÇélIné Ändrè1)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreMatch:caseIgnoreOrderingMatch:=ÇélIné Ändrè1)-6] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreListMatch:caseIgnoreListMatch:=foo1$bar)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrobjectIdentifierMatch:objectIdentifierMatch:=1.3.6.1.4.1.1466.115.121.1.15)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrdirectoryStringFirstComponentMatch:directoryStringFirstComponentMatch:=ÇélIné Ändrè1)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrobjectIdentifierFirstComponentMatch:objectIdentifierFirstComponentMatch:=1.3.6.1.4.1.1466.115.121.1.15)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrdistinguishedNameMatch:distinguishedNameMatch:=cn=foo1,cn=bar)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrintegerMatch:integerMatch:=-2)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrintegerMatch:integerOrderingMatch:=-2)-6] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrintegerFirstComponentMatch:integerFirstComponentMatch:=-2)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attruniqueMemberMatch:uniqueMemberMatch:=cn=foo1,cn=bar#'0001'B)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrnumericStringMatch:numericStringMatch:=00001)-10] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrnumericStringMatch:numericStringMatch:=00001)-11] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrtelephoneNumberMatch:telephoneNumberMatch:=+1 408 555 4798)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attroctetStringMatch:octetStringMatch:=AAAAAAAAAAAAAAE=)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attroctetStringMatch:octetStringOrderingMatch:=AAAAAAAAAAAAAAE=)-6] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactMatch=*ÇélIné Ändrè*)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactMatch=ÇélIné Ändrè*)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactMatch=*ÇélIné Ändrè)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactMatch=*é Ä*)-5] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactIA5Match=*Sprain*)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactIA5Match=Sprain*)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactIA5Match=*Sprain)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactIA5Match=*rai*)-3] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreIA5Match=*sprain1*)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreIA5Match=sprain1*)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreIA5Match=*sprain1)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreIA5Match=*rai*)-6] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreMatch=*ÇélIné Ändrè1*)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreMatch=ÇélIné Ändrè1*)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreMatch=*ÇélIné Ändrè1)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreMatch=*é Ä*)-6] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreListMatch=*foo1$bar*)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreListMatch=foo1$bar*)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreListMatch=*foo1$bar)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreListMatch=*1$b*)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrnumericStringMatch=*00001*)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrnumericStringMatch=00001*)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrnumericStringMatch=*00001)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrnumericStringMatch=*000*)-6] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrtelephoneNumberMatch=*+1 408 555 4798*)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrtelephoneNumberMatch=+1 408 555 4798*)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrtelephoneNumberMatch=*+1 408 555 4798)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrtelephoneNumberMatch=* 55*)-6] 0.01
No log output captured.
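The extensible-search ids above use the RFC 4515 form (attribute:matchingRule:=value), which names the matching rule explicitly instead of relying on the attribute's default equality rule; the ...OrderingMatch variants turn the assertion into an ordered comparison rather than exact equality, which is why those ids expect more entries. A short illustration with hypothetical connection details (the attr<RuleName> attributes are the suite's own test schema):

import ldap

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('cn=Directory Manager', 'password')

filters = [
    "(attrcaseExactMatch:caseExactMatch:=ÇélIné Ändrè)",  # explicit equality rule
    "(attrintegerMatch:integerOrderingMatch:=-2)",        # ordering rule, ordered comparison
    "(attrcaseExactIA5Match=*rai*)",                      # plain substring for contrast
]
for filt in filters:
    entries = conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE, filt)
    print(filt, '->', len(entries), 'entries')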
Passed suites/filter/filter_indexing_test.py::test_positive[(|(|(ou=nothing1)(ou=people))(|(ou=nothing2)(ou=nothing3)))] 0.01
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/filter_indexing_test.py::test_positive[(|(|(ou=people)(ou=nothing1))(|(ou=nothing2)(ou=nothing3)))] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_positive[(|(|(ou=nothing1)(ou=nothing2))(|(ou=people)(ou=nothing3)))] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_positive[(|(|(ou=nothing1)(ou=nothing2))(|(ou=nothing3)(ou=people)))] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_positive[(&(sn<=0000000000000000)(givenname>=FFFFFFFFFFFFFFFF))] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_positive[(&(sn>=0000000000000000)(sn<=1111111111111111))] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_positive[(&(sn>=0000000000000000)(givenname<=FFFFFFFFFFFFFFFF))] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_schema 3.46
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(uidNumber=18446744073709551617)] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(gidNumber=18446744073709551617)] 0.17
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(MYINTATTR=18446744073709551617)] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(&(uidNumber=*)(!(uidNumber=18446744073709551617)))] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(&(gidNumber=*)(!(gidNumber=18446744073709551617)))] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(&(uidNumber=*)(!(gidNumber=18446744073709551617)))] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(&(myintattr=*)(!(myintattr=18446744073709551617)))] 0.16
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(uidNumber>=-18446744073709551617)] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(gidNumber>=-18446744073709551617)] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(uidNumber<=18446744073709551617)] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(gidNumber<=18446744073709551617)] 0.17
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(myintattr<=18446744073709551617)] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_negative[(gidNumber=54321)] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_negative[(uidNumber=54321)] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_negative[(myintattr=54321)] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_negative[(gidNumber<=-999999999999999999999999999999)] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_negative[(uidNumber<=-999999999999999999999999999999)] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_negative[(myintattr<=-999999999999999999999999999999)] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_negative[(gidNumber>=999999999999999999999999999999)] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_negative[(uidNumber>=999999999999999999999999999999)] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_negative[(myintattr>=999999999999999999999999999999)] 0.00
No log output captured.
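The indexing cases probe integer syntax with values outside the 64-bit range (18446744073709551617 is 2**64 + 1) to confirm that equality and range filters over indexed numeric attributes still evaluate cleanly. A sketch of that style of probe, with hypothetical connection details and printing counts rather than asserting the suite's exact expectations:

import ldap

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('cn=Directory Manager', 'password')

huge = str(2**64 + 1)  # 18446744073709551617, beyond any ordinary uidNumber
for filt in (f'(uidNumber={huge})', f'(uidNumber<={huge})', f'(uidNumber>=-{huge})'):
    entries = conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE, filt)
    print(filt, '->', len(entries), 'entries')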
Passed suites/filter/filter_logic_test.py::test_eq 0.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/filter_logic_test.py::test_sub 0.01
No log output captured.
Passed suites/filter/filter_logic_test.py::test_not_eq 0.01
No log output captured.
Passed suites/filter/filter_logic_test.py::test_ranges 0.01
No log output captured.
Passed suites/filter/filter_logic_test.py::test_and_eq 0.01
No log output captured.
Passed suites/filter/filter_logic_test.py::test_range 0.00
No log output captured.
Passed suites/filter/filter_logic_test.py::test_and_allid_shortcut 0.01
No log output captured.
Passed suites/filter/filter_logic_test.py::test_or_eq 0.01
No log output captured.
Passed suites/filter/filter_logic_test.py::test_and_not_eq 0.01
No log output captured.
Passed suites/filter/filter_logic_test.py::test_or_not_eq 0.00
No log output captured.
Passed suites/filter/filter_logic_test.py::test_and_range 0.02
No log output captured.
Passed suites/filter/filter_logic_test.py::test_or_range 0.01
No log output captured.
Passed suites/filter/filter_logic_test.py::test_and_and_eq 0.02
No log output captured.
Passed suites/filter/filter_logic_test.py::test_or_or_eq 0.01
No log output captured.
Passed suites/filter/filter_logic_test.py::test_and_or_eq 0.02
No log output captured.
Passed suites/filter/filter_logic_test.py::test_or_and_eq 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_matching_rules 0.07
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/filter_match_test.py::test_add_attribute_types 2.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule0] 0.50
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule1] 0.02
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule2] 0.26
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule3] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule4] 0.02
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule5] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule6] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule7] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule8] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule9] 0.02
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule10] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule11] 0.02
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule12] 0.02
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule13] 0.02
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule14] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule15] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule16] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule17] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode0] 0.04
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode1] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode2] 0.13
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode3] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode4] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode5] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode6] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode7] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode8] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode9] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode10] 0.13
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode11] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode12] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode13] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode14] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode15] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode16] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode17] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode0] 0.06
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode1] 0.06
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode2] 0.05
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode3] 0.05
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode4] 0.29
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode5] 0.06
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode6] 0.30
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode7] 0.05
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode8] 0.06
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode9] 0.06
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode10] 0.05
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode11] 0.30
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode12] 0.05
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode13] 0.05
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode14] 0.06
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode15] 0.05
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode16] 0.05
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode17] 0.05
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrbitStringMatch='0001'B)-1-(attrbitStringMatch:bitStringMatch:='000100000'B)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrgeneralizedTimeMatch=20100218171300Z)-1-(attrcaseExactIA5Match=SPRAIN)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrcaseExactMatch>=ÇélIné Ändrè)-5-(attrcaseExactMatch=ÇéLINé ÄNDRè)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrcaseExactMatch:caseExactMatch:=ÇélIné Ändrè)-1-(attrcaseExactMatch>=çéliné ändrè)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrcaseExactIA5Match=Sprain)-1-(attrgeneralizedTimeMatch=20300218171300Z)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrbooleanMatch=TRUE)-1-(attrgeneralizedTimeMatch>=20300218171300Z)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrcaseIgnoreIA5Match=sprain1)-1-(attrcaseIgnoreIA5Match=sprain9999)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrcaseIgnoreMatch=ÇélIné Ändrè1)-1-(attrcaseIgnoreMatch=ÇélIné Ändrè9999)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrcaseIgnoreMatch>=ÇélIné Ändrè1)-6-(attrcaseIgnoreMatch>=ÇélIné Ändrè9999)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrcaseIgnoreListMatch=foo1$bar)-1-(attrcaseIgnoreListMatch=foo1$bar$baz$biff)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrobjectIdentifierMatch=1.3.6.1.4.1.1466.115.121.1.15)-1-(attrobjectIdentifierMatch=1.3.6.1.4.1.1466.115.121.1.15.99999)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrgeneralizedTimeMatch>=20100218171300Z)-6-(attroctetStringMatch>=AAAAAAAAAAABAQQ=)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrdirectoryStringFirstComponentMatch=ÇélIné Ändrè1)-1-(attrdirectoryStringFirstComponentMatch=ÇélIné Ändrè9999)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrobjectIdentifierFirstComponentMatch=1.3.6.1.4.1.1466.115.121.1.15)-1-(attrobjectIdentifierFirstComponentMatch=1.3.6.1.4.1.1466.115.121.1.15.99999)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrdistinguishedNameMatch=cn=foo1,cn=bar)-1-(attrdistinguishedNameMatch=cn=foo1,cn=bar,cn=baz)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrintegerMatch=-2)-1-(attrintegerMatch=-20)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrintegerMatch>=-2)-6-(attrintegerMatch>=20)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrintegerFirstComponentMatch=-2)-1-(attrintegerFirstComponentMatch=-20)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attruniqueMemberMatch=cn=foo1,cn=bar#'0001'B)-1-(attruniqueMemberMatch=cn=foo1,cn=bar#'00010000'B)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrnumericStringMatch=00001)-1-(attrnumericStringMatch=000000001)] 0.26
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrnumericStringMatch>=00001)-6-(attrnumericStringMatch>=01)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrtelephoneNumberMatch=+1 408 555 4798)-1-(attrtelephoneNumberMatch=+2 408 555 4798)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attroctetStringMatch=AAAAAAAAAAAAAAE=)-1-(attroctetStringMatch=AAAAAAAAAAAAAAEB)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attroctetStringMatch>=AAAAAAAAAAAAAAE=)-6-(attroctetStringMatch>=AAAAAAAAAAABAQE=)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrbitStringMatch:bitStringMatch:='0001'B)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactIA5Match:caseExactIA5Match:=Sprain)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactMatch:caseExactMatch:=ÇélIné Ändrè)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactMatch:caseExactOrderingMatch:=ÇélIné Ändrè)-5] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrgeneralizedTimeMatch:generalizedTimeMatch:=20100218171300Z)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrgeneralizedTimeMatch:generalizedTimeOrderingMatch:=20100218171300Z)-6] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrbooleanMatch:booleanMatch:=TRUE)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreIA5Match:caseIgnoreIA5Match:=sprain1)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreMatch:caseIgnoreMatch:=ÇélIné Ändrè1)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreMatch:caseIgnoreOrderingMatch:=ÇélIné Ändrè1)-6] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreListMatch:caseIgnoreListMatch:=foo1$bar)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrobjectIdentifierMatch:objectIdentifierMatch:=1.3.6.1.4.1.1466.115.121.1.15)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrdirectoryStringFirstComponentMatch:directoryStringFirstComponentMatch:=ÇélIné Ändrè1)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrobjectIdentifierFirstComponentMatch:objectIdentifierFirstComponentMatch:=1.3.6.1.4.1.1466.115.121.1.15)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrdistinguishedNameMatch:distinguishedNameMatch:=cn=foo1,cn=bar)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrintegerMatch:integerMatch:=-2)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrintegerMatch:integerOrderingMatch:=-2)-6] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrintegerFirstComponentMatch:integerFirstComponentMatch:=-2)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attruniqueMemberMatch:uniqueMemberMatch:=cn=foo1,cn=bar#'0001'B)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrnumericStringMatch:numericStringMatch:=00001)-10] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrnumericStringMatch:numericStringMatch:=00001)-11] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrtelephoneNumberMatch:telephoneNumberMatch:=+1 408 555 4798)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attroctetStringMatch:octetStringMatch:=AAAAAAAAAAAAAAE=)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attroctetStringMatch:octetStringOrderingMatch:=AAAAAAAAAAAAAAE=)-6] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactMatch=*ÇélIné Ändrè*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactMatch=ÇélIné Ändrè*)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactMatch=*ÇélIné Ändrè)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactMatch=*é Ä*)-5] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactIA5Match=*Sprain*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactIA5Match=Sprain*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactIA5Match=*Sprain)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactIA5Match=*rai*)-3] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreIA5Match=*sprain1*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreIA5Match=sprain1*)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreIA5Match=*sprain1)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreIA5Match=*rai*)-6] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreMatch=*ÇélIné Ändrè1*)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreMatch=ÇélIné Ändrè1*)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreMatch=*ÇélIné Ändrè1)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreMatch=*é Ä*)-6] 0.16
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreListMatch=*foo1$bar*)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreListMatch=foo1$bar*)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreListMatch=*foo1$bar)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreListMatch=*1$b*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrnumericStringMatch=*00001*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrnumericStringMatch=00001*)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrnumericStringMatch=*00001)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrnumericStringMatch=*000*)-6] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrtelephoneNumberMatch=*+1 408 555 4798*)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrtelephoneNumberMatch=+1 408 555 4798*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrtelephoneNumberMatch=*+1 408 555 4798)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrtelephoneNumberMatch=* 55*)-6] 0.01
No log output captured.
Passed suites/filter/filter_test.py::test_filter_escaped 0.31
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.filter_test:filter_test.py:42 Running test_filter_escaped...
INFO  tests.suites.filter.filter_test:filter_test.py:78 test_filter_escaped: PASSED
Passed suites/filter/filter_test.py::test_filter_search_original_attrs 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.filter_test:filter_test.py:95 Running test_filter_search_original_attrs...
INFO  tests.suites.filter.filter_test:filter_test.py:108 test_filter_search_original_attrs: PASSED
Passed suites/filter/filter_test.py::test_filter_scope_one 0.16
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.filter_test:filter_test.py:128 Search user using ldapsearch with scope one
INFO  tests.suites.filter.filter_test:filter_test.py:130 [dn: cn=Directory Administrators,dc=example,dc=com cn: Directory Administrators ]
INFO  tests.suites.filter.filter_test:filter_test.py:132 Search should only have one entry
Passed suites/filter/filter_test.py::test_filter_with_attribute_subtype 0.62
-------------------------------Captured log call--------------------------------
INFO  lib389:filter_test.py:157 Bind as cn=Directory Manager
INFO  lib389:filter_test.py:164 ######################### ADD ######################
INFO  lib389:filter_test.py:187 Try to add Add cn=test_entry both, dc=example,dc=com: dn: cn=test_entry both, dc=example,dc=com cn: test_entry both cn;en: test_entry en cn;fr: test_entry fr objectclass: top objectclass: person sn: test_entry both
INFO  lib389:filter_test.py:190 Try to add Add cn=test_entry en only, dc=example,dc=com: dn: cn=test_entry en only, dc=example,dc=com cn: test_entry en only cn;en: test_entry en objectclass: top objectclass: person sn: test_entry en only
INFO  lib389:filter_test.py:193 ######################### SEARCH ######################
INFO  lib389:filter_test.py:197 Try to search with filter (&(sn=test_entry en only)(!(cn=test_entry fr)))
INFO  lib389:filter_test.py:201 Found cn=test_entry en only,dc=example,dc=com
INFO  lib389:filter_test.py:205 Try to search with filter (&(sn=test_entry en only)(!(cn;fr=test_entry fr)))
INFO  lib389:filter_test.py:209 Found cn=test_entry en only,dc=example,dc=com
INFO  lib389:filter_test.py:213 Try to search with filter (&(sn=test_entry en only)(!(cn;en=test_entry en)))
INFO  lib389:filter_test.py:216 Found none
INFO  lib389:filter_test.py:218 ######################### DELETE ######################
INFO  lib389:filter_test.py:220 Try to delete cn=test_entry both, dc=example,dc=com
INFO  lib389:filter_test.py:223 Try to delete cn=test_entry en only, dc=example,dc=com
INFO  tests.suites.filter.filter_test:filter_test.py:226 Testcase PASSED
Passed suites/filter/filter_test.py::test_extended_search 0.27
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.filter_test:filter_test.py:253 Running test_extended_search... INFO  lib389:filter_test.py:270 Try to search with filter (cn:de:=ext-test-entry) INFO  lib389:filter_test.py:276 Try to search with filter (cn:caseIgnoreIA5Match:=EXT-TEST-ENTRY) INFO  lib389:filter_test.py:282 Try to search with filter (cn:caseIgnoreMatch:=EXT-TEST-ENTRY) INFO  lib389:filter_test.py:288 Try to search with filter (cn:caseExactMatch:=EXT-TEST-ENTRY) INFO  lib389:filter_test.py:294 Try to search with filter (cn:caseExactMatch:=ext-test-entry) INFO  lib389:filter_test.py:300 Try to search with filter (cn:caseExactIA5Match:=EXT-TEST-ENTRY) INFO  lib389:filter_test.py:306 Try to search with filter (cn:caseExactIA5Match:=ext-test-entry)
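The filters in this test use RFC 4511 extensible-match syntax, (attribute:matchingRule:=value), where the rule may be named or given by OID. A hedged sketch (connection details hypothetical):

import ldap

conn = ldap.initialize('ldap://localhost:389')            # hypothetical
conn.simple_bind_s('cn=Directory Manager', 'password')    # hypothetical

# caseExactMatch compares case byte-for-byte, so only the first of these
# two searches should return an entry whose cn is 'ext-test-entry':
conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE,
              '(cn:caseExactMatch:=ext-test-entry)')
conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE,
              '(cn:caseExactMatch:=EXT-TEST-ENTRY)')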
Passed suites/filter/filter_with_non_root_user_test.py::test_telephone[(telephonenumber=*7393)] 0.04
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/filter_with_non_root_user_test.py::test_telephone[(telephonenumber=*408*3)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid=mward)] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(l=sunnyvale)0] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(mail=jreu*)] 0.26
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(mail=*exam*)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid=*)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.15.1:=>AAA)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:es:=>AAA)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.15.1.5:=AAA)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.15.1:=>user100)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:es:=>user100)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.15.1.5:=user100)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.15.1.1:=user1)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.15.1.1:=z)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid=user1)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid<=Z)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid>=1)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid>=A)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid>=user20)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.18.1.2:=user20)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.18.1.2:=z)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.18.1:=>=A)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:fr:=>=A)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.18.1.4:=A)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.18.1:=>=user20)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:fr:=>=user20)] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.18.1.4:=user20)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.18.1:=>=z)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:fr:=>=z)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(description=This is the special * attribute value)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(description=*x*)] 0.26
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid=ptyler)] 0.11
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid=*wal*)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(roomNumber=0312)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid=mw*)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(roomNumber=2295)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(l=Cupertino)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(objectclass=inetorgperson)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(l=sunnyvale)1] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(roomNumber=200)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(roomNumber=201)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(roomNumber=202)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(l=*)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(sn~=tiller))(!(uid=ptyler)))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(sn~=tiller)) (uid=ptyler))] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(uid=*wal*) (roomNumber=0312))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(uid=*wal*))(!(roomNumber=0312)))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(uid=*wal*))(roomNumber=0312))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(uid=*wal*)(!(roomNumber=0312)))] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(uid=*wal*)(|(sn~=tiller) (roomNumber=2295)))] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(&(uid=*wal*) (roomNumber=2295))(&(uid=*wal*) (sn~=tiller)))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295)))0] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(|(uid=*wal*) (sn~=tiller))(|(uid=*wal*) (roomNumber=2295)))0] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(uid=*wal*) (roomNumber=2295))0] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(roomNumber=2295) (uid=*wal*))0] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(roomNumber=2295) (uid=*wal*))0] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295)))1] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(uid=*wal*) (roomNumber=2295))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(uid=*wal*) (l=*))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(l=Cupertino) (|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295))))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(l=Cupertino))(!(|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295)))))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(l=Cupertino))(|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295))))] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(l=Cupertino)(!(|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295)))))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(l=Cupertino) (|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295))))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(l=Cupertino))(!(|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295)))))] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(l=Cupertino))(|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295))))] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(l=Cupertino)(!(|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295)))))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(uid=user1))(objectclass=inetorgperson))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(uid=user1))(objectclass=inetorgperson))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(mail=cnewport@example.com))(l=sunnyvale))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(|(uid=*wal*) (sn~=tiller))(|(uid=*wal*) (roomNumber=2295)))1] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(uid=*wal*) (roomNumber=2295))1] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(roomNumber=2295) (uid=*wal*))1] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(roomNumber=2295) (uid=*wal*))1] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(roomNumber=200))(!(roomNumber=201))(!(roomNumber=202))(l=sunnyvale))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(uid=user40))(&(!(uid=user1))(!(uid=user20))(!(uid=user30))(objectclass=inetorgperson)))] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(uid=user40))(&(!(uid=user1))(!(uid=user20))(!(uid=user30))(objectclass=inetorgperson)))] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(roomNumber=2254))(&(!(roomNumber=200))(!(roomNumber=201))(!(roomNumber=202))(l=sunnyvale)))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(roomNumber=2254))(&(!(roomNumber=200))(!(roomNumber=201))(!(roomNumber=202))(l=sunnyvale)))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(uid=user1))(!(uid:2.16.840.1.113730.3.3.2.18.1:=<=user20))(!(uid=user30)))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(uid=user1))(!(uid:2.16.840.1.113730.3.3.2.18.1:=<=user20))(!(uid=user30)))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(roomNumber=4012))(!(roomNumber=3924))(!(roomNumber=4508)))] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(roomNumber=4012))(!(roomNumber=3924))(!(roomNumber=4508)))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(& (objectclass=inetorgperson)(!(uid=user1))(!(uid:2.16.840.1.113730.3.3.2.18.1:=<=user20))(!(uid=user30)))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(| (objectclass=inetorgperson)(!(uid=user1))(!(uid:2.16.840.1.113730.3.3.2.18.1:=<=user20))(!(uid=user30)))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(l=sunnyvale)(!(roomNumber=4012))(!(roomNumber=3924))(!(roomNumber=4508)))] 0.27
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(l=sunnyvale)(!(roomNumber=4012))(!(roomNumber=3924))(!(roomNumber=4508)))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(!(|(!(l=*))(!(l=sunnyvale))))] 0.12
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(l=*))(!(l=sunnyvale)))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(l=Cupertino))(!(mail=*exam*))(!(|(uid=*wal*) (l=*))))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(l=Cupertino))(!(mail=*exam*))(|(uid=*wal*) (l=*)))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(l=Cupertino))(mail=*exam*) (|(uid=*wal*) (l=*)))] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(l=Cupertino) (mail=*exam*) (|(uid=*wal*) (l=*)))] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(l=Cupertino))(!(mail=*exam*))(!(|(uid=*wal*) (l=*))))] 0.22
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(l=Cupertino))(!(mail=*exam*))(|(uid=*wal*) (l=*)))] 0.12
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(l=Cupertino))(mail=*exam*)(!(|(uid=*wal*) (l=*))))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(l=Cupertino))(mail=*exam*) (|(uid=*wal*) (l=*)))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(l=Cupertino)(!(mail=*exam*))(!(|(uid=*wal*) (l=*))))] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(l=Cupertino)(!(mail=*exam*))(|(uid=*wal*) (l=*)))] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(l=Cupertino) (mail=*exam*)(!(|(uid=*wal*) (l=*))))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(l=Cupertino)(!(mail=*exam*))(|(uid=*wal*) (l=*)))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(userpassword=*)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(fred=*)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:2.16.840.1.113730.3.3.2.15.1:=<1)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:es:=<1)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:2.16.840.1.113730.3.3.2.15.1.1:=1)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:2.16.840.1.113730.3.3.2.15.1:=<user1)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:es:=<user1)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:2.16.840.1.113730.3.3.2.15.1:=<z)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:es:=<z)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid<=1)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid<=A)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid>=Z)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:2.16.840.1.113730.3.3.2.18.1:=<=A)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:fr:=<=A)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:2.16.840.1.113730.3.3.2.18.1.2:=A)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:2.16.840.1.113730.3.3.2.18.1:=<=user20)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:fr:=<=user20)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:2.16.840.1.113730.3.3.2.18.1:=<=z)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:fr:=<=z)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:2.16.840.1.113730.3.3.2.18.1.4:=z)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(sn~=tiller)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(givenName~=pricella)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(mail=cnewport@example.com)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid=user20)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid=user30)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid=user40)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(&(sn~=tiller) (givenName~=pricella))] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(&(sn~=tiller)(!(uid=ptyler)))] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(&(!(l=Cupertino))(mail=*exam*)(!(|(uid=*wal*) (l=*))))] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(&(l=Cupertino)(!(mail=*exam*))(!(|(uid=*wal*) (l=*))))] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(&(l=Cupertino) (mail=*exam*)(!(|(uid=*wal*) (l=*))))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(&(l=Cupertino) (mail=*exam*) (|(uid=*wal*) (l=*)))] 0.02
No log output captured.
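The OID-tagged filters throughout this suite (2.16.840.1.113730.3.3.2.15.1, 2.16.840.1.113730.3.3.2.18.1, and the es/fr short forms) appear to name 389-ds internationalized collation matching rules, so comparisons like >= are evaluated under a locale's collation rather than raw byte order; this reading is inferred from the parameters above. A hypothetical search:

import ldap

conn = ldap.initialize('ldap://localhost:389')            # hypothetical
conn.simple_bind_s('cn=Directory Manager', 'password')    # hypothetical

# 'fr' selects the French collation; the leading '>=' in the assertion
# value makes this an ordering comparison under that collation.
conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE, '(uid:fr:=>=user20)')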
Passed suites/filter/filterscanlimit_test.py::test_idlistscanlimit 9.04
No log output captured.
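test_idlistscanlimit exercises the per-index ID-list ceiling. nsslapd-idlistscanlimit lives on the backend entry under cn=ldbm database,cn=plugins,cn=config; once an index key would yield more entry IDs than the limit, the server treats that key as unindexed. A hedged sketch of adjusting it (the DN assumes the default userRoot backend; credentials hypothetical):

import ldap

conn = ldap.initialize('ldap://localhost:389')            # hypothetical
conn.simple_bind_s('cn=Directory Manager', 'password')    # hypothetical

conn.modify_s('cn=userRoot,cn=ldbm database,cn=plugins,cn=config',
              [(ldap.MOD_REPLACE, 'nsslapd-idlistscanlimit', [b'4000'])])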
Passed suites/filter/large_filter_test.py::test_large_filter[(&(objectClass=person)(|(manager=uid=fmcdonnagh,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_0,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_1,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_2,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_3,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_4,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_5,dc=anuj,dc=com)(manager=uid=jvedder, dc=anuj, dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_6,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_7,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_8,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_9,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_10,dc=anuj,dc=com)(manager=uid=cnewport, dc=anuj, dc=com)))] 0.04
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/large_filter_test.py::test_large_filter[(&(objectClass=person)(|(manager=uid=fmcdonnagh *)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_0,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_1,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_2,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_3,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_4,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_5,*)(manager=uid=jvedder,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_6,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_7,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_8,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_9,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_10,*)(manager=uid=cnewport,*)))] 0.02
No log output captured.
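The parameters above stress filter parsing with very wide OR lists; such filters are easier to generate than to write by hand. A self-contained sketch in the same shape as the large_filter_test parameters:

# Build a wide OR filter over many manager DNs, most of them nonexistent.
managers = ['uid=fmcdonnagh,dc=anuj,dc=com'] + [
    'cn=no_such_entry_with_a_really_long_dn_component_to_stress'
    '_the_filter_handling_code_%d,dc=anuj,dc=com' % i
    for i in range(11)
]
large_filter = '(&(objectClass=person)(|%s))' % ''.join(
    '(manager=%s)' % m for m in managers)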
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_supported_features 0.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[-False-oper_attr_list0-] 0.11
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:147 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[-False-oper_attr_list0-*] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:147 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[-False-oper_attr_list0-objectClass] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:147 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[-True-oper_attr_list1-] 0.03
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:144 bound as: uid=all_attrs_test,ou=people,dc=example,dc=com
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[-True-oper_attr_list1-*] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:144 bound as: uid=all_attrs_test,ou=people,dc=example,dc=com
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[-True-oper_attr_list1-objectClass] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:144 bound as: uid=all_attrs_test,ou=people,dc=example,dc=com
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[ou=people,dc=example,dc=com-False-oper_attr_list2-] 0.11
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:147 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[ou=people,dc=example,dc=com-False-oper_attr_list2-*] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:147 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[ou=people,dc=example,dc=com-False-oper_attr_list2-objectClass] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:147 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[ou=people,dc=example,dc=com-True-oper_attr_list3-] 0.03
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:144 bound as: uid=all_attrs_test,ou=people,dc=example,dc=com
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[ou=people,dc=example,dc=com-True-oper_attr_list3-*] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:144 bound as: uid=all_attrs_test,ou=people,dc=example,dc=com
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[ou=people,dc=example,dc=com-True-oper_attr_list3-objectClass] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:144 bound as: uid=all_attrs_test,ou=people,dc=example,dc=com
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[uid=all_attrs_test,ou=people,dc=example,dc=com-False-oper_attr_list4-] 0.11
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:147 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[uid=all_attrs_test,ou=people,dc=example,dc=com-False-oper_attr_list4-*] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:147 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[uid=all_attrs_test,ou=people,dc=example,dc=com-False-oper_attr_list4-objectClass] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:147 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[uid=all_attrs_test,ou=people,dc=example,dc=com-True-oper_attr_list5-] 0.03
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:144 bound as: uid=all_attrs_test,ou=people,dc=example,dc=com
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[uid=all_attrs_test,ou=people,dc=example,dc=com-True-oper_attr_list5-*] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:144 bound as: uid=all_attrs_test,ou=people,dc=example,dc=com
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[uid=all_attrs_test,ou=people,dc=example,dc=com-True-oper_attr_list5-objectClass] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:144 bound as: uid=all_attrs_test,ou=people,dc=example,dc=com
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[cn=config-False-oper_attr_list6-] 0.10
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:147 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[cn=config-False-oper_attr_list6-*] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:147 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[cn=config-False-oper_attr_list6-objectClass] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:147 bound as: cn=Directory Manager
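RFC 3673, which this suite covers, defines the special request attribute '+' meaning "all operational attributes"; combined with '*' it returns user attributes as well, which is what the oper_attr_list parametrisations above verify against different bind identities and base DNs. A sketch (connection details hypothetical):

import ldap

conn = ldap.initialize('ldap://localhost:389')            # hypothetical
conn.simple_bind_s('cn=Directory Manager', 'password')    # hypothetical

# '*' = all user attributes, '+' = all operational attributes (RFC 3673).
dn, attrs = conn.search_s('dc=example,dc=com', ldap.SCOPE_BASE,
                          '(objectClass=*)', ['*', '+'])[0]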
Passed suites/filter/schema_validation_test.py::test_filter_validation_config 0.56
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/schema_validation_test.py::test_filter_validation_enabled 3.91
No log output captured.
Passed suites/filter/schema_validation_test.py::test_filter_validation_warn_safe 0.13
No log output captured.
Passed suites/filter/schema_validation_test.py::test_filter_validation_warn_unsafe 0.13
No log output captured.
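These four tests toggle the server's filter schema verification, governed in 389-ds by the cn=config attribute nsslapd-verify-filter-schema. The test names suggest warn versus enforce modes; the value string below is an assumption from memory of the documented settings, so check your version's documentation:

import ldap

conn = ldap.initialize('ldap://localhost:389')            # hypothetical
conn.simple_bind_s('cn=Directory Manager', 'password')    # hypothetical

# Assumed value string; other documented modes reject, or silently
# process, filters that reference attributes absent from the schema.
conn.modify_s('cn=config',
              [(ldap.MOD_REPLACE, 'nsslapd-verify-filter-schema', [b'warn-invalid'])])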
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition0-cn] 0.01
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition1-cn] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition2-cn] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition3-cn] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition4-modifiersName] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition5-modifyTimestamp] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition6-modifiersName] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition7-modifyTimestamp] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition8-modifiersName] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition9-modifyTimestamp] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition10-cn] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition11-cn] 0.00
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition12-modifiersName] 0.00
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition13-nsRoleDN] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition14-cn] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition15-modifiersName] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition16-nsRoleDN] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(uid=rjense2)-condition17-mailquota] 0.00
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(uid=rjense2)-condition18-mailquota] 0.00
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(uid=rjense2)-condition19-mailquota] 0.00
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(uid=rjense2)-condition20-mailquota] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(uid=rjense2)-condition21-nsRoleDN] 0.00
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(nsRoleDN=cn=new managed *)-condition22-cn] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(nsRoleDN=cn=new managed *)-condition23-nsRoleDN] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (nsRoleDN=cn=new managed *) (uid=mtyler))-condition24-mailquota] 0.00
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (nsRoleDN=cn=new managed *) (uid=mtyler))-condition25-nsRoleDN] 0.00
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (nsRoleDN=cn=new managed *) (uid=mtyler))-condition26-mailquota] 0.00
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (nsRoleDN=cn=new managed *) (uid=mtyler))-condition27-modifiersName] 0.00
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (nsRoleDN=cn=new managed *) (uid=mtyler))-condition28-nsRoleDN] 0.00
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (nsRoleDN=cn=new managed *) (uid=mtyler))-condition29-nsRoleDN] 0.00
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (nsRoleDN=cn=new managed *) (uid=mtyler))-condition30-modifiersName] 0.01
No log output captured.
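nsRole and nsRoleDN, used throughout these parameters, are computed attributes maintained by the 389-ds roles plugin rather than stored values, which is what makes filtering on them a "virtual attribute" test. A hypothetical search (the role DN is illustrative):

import ldap

conn = ldap.initialize('ldap://localhost:389')            # hypothetical
conn.simple_bind_s('cn=Directory Manager', 'password')    # hypothetical

# Filter on computed role membership and read the computed attribute back.
conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE,
              '(nsRoleDN=cn=new managed role,dc=example,dc=com)',
              ['cn', 'nsRole'])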
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(postalCode=99999)] 0.01
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(postalAddress=345 California Av., Mountain View, CA)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(postalCode:2.16.840.1.113730.3.3.2.7.1:=88888)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(postalCode:2.16.840.1.113730.3.3.2.7.1.3:=66666)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(emailclass=vpe*)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(emailclass=*emai*)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota=*00)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota=*6*0)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(nsRole=*)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(postalAddress=*)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(emailclass:2.16.840.1.113730.3.3.2.15.1:=>AAA)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(emailclass:es:=>AAA)] 0.16
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(emailclass:2.16.840.1.113730.3.3.2.15.1.5:=AAA)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(emailclass:2.16.840.1.113730.3.3.2.15.1:=>vpemail)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(emailclass:es:=>vpemail)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:2.16.840.1.113730.3.3.2.15.1.1:=900)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota<=600)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota>=600)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(nsRole~=cn=new)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(uid=*wal*)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(uid=mw*)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(roomNumber=0312)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(l=Cupertino)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(uid=user1)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(objectclass=inetorgperson)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(l=sunnyvale)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(roomNumber=3924)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(l=*)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(objectclass=*)] 0.02
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota<=900)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota>=100)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:2.16.840.1.113730.3.3.2.18.1.2:=600)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:2.16.840.1.113730.3.3.2.18.1.2:=900)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:2.16.840.1.113730.3.3.2.18.1:=>=900)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:fr:=>=900)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:2.16.840.1.113730.3.3.2.18.1:=>=600)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:fr:=>=600)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:2.16.840.1.113730.3.3.2.18.1.4:=600)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:2.16.840.1.113730.3.3.2.18.1:=>=100)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:fr:=>=100)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:2.16.840.1.113730.3.3.2.18.1.4:=100)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(nsRole~=cn=new managed))(!(nsRole=cn=new vaddr filtered role,dc=example,dc=com)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(uid=*wal*) (nsRole=cn=*another*))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(uid=*wal*))(!(nsRole=cn=*another*)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(uid=*wal*))(nsRole=cn=*another*))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(uid=*wal*)(!(nsRole=cn=*another*)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(uid=*wal*)(|(nsRole~=cn=new managed) (l=Cupertino)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(&(uid=*wal*) (l=Cupertino))(&(uid=*wal*) (nsRole~=cn=new managed)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(uid=*wal*)(&(nsRole~=cn=new managed) (l=Cupertino)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(|(uid=*wal*) (nsRole~=cn=new managed))(|(uid=*wal*) (l=Cupertino)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(nsRole=cn=*vaddr*) (uid=*wal*))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(uid=*wal*)(&(nsRole~=cn=new managed) (nsRole=cn=*vaddr*)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(uid=*wal*) (nsRole=cn=*vaddr*))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(nsRole=cn=*vaddr*) (l=*))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(l=Cupertino) (|(uid=*wal*)(&(nsRole~=cn=new managed) (nsRole=cn=*vaddr*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(l=Cupertino))(!(|(uid=*wal*)(&(nsRole~=cn=new managed) (nsRole=cn=*vaddr*)))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(l=Cupertino))(|(uid=*wal*)(&(nsRole~=cn=new managed) (nsRole=cn=*vaddr*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(l=Cupertino)(!(|(uid=*wal*)(&(nsRole~=cn=new managed) (nsRole=cn=*vaddr*)))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(l=Cupertino) (|(uid=*wal*)(&(nsRole~=cn=new managed) (nsRole=cn=*vaddr*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(l=Cupertino))(!(|(uid=*wal*)(&(nsRole~=cn=new managed) (nsRole=cn=*vaddr*)))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(l=Cupertino))(|(uid=*wal*)(&(nsRole~=cn=new managed) (nsRole=cn=*vaddr*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(l=Cupertino)(!(|(uid=*wal*)(&(nsRole~=cn=new managed) (nsRole=cn=*vaddr*)))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(uid=user1))(objectclass=inetorgperson))] 0.15
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(uid=user1))(objectclass=inetorgperson))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(nsRole=cn=*vaddr*))(l=sunnyvale))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(nsRole=cn=*vaddr*))(l=sunnyvale))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(uid=user1))(!(uid=user20))(!(uid=user30))(objectclass=inetorgperson))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(uid=user1))(!(uid=user20))(!(uid=user30))(objectclass=inetorgperson))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(nsRole=cn=another vaddr role,dc=example,dc=com))(!(roomNumber=3924))(!(roomNumber=4508))(l=sunnyvale))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(nsRole=cn=another vaddr role,dc=example,dc=com))(!(roomNumber=3924))(!(roomNumber=4508))(l=sunnyvale))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(uid=user40))(&(!(uid=user1))(!(uid=user20))(!(uid=user30))(objectclass=inetorgperson)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(uid=user40))(&(!(uid=user1))(!(uid=user20))(!(uid=user30))(objectclass=inetorgperson)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(roomNumber=2254))(&(!(nsRole=cn=another vaddr role,dc=example,dc=com))(!(roomNumber=3924))(!(roomNumber=4508))(l=sunnyvale)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(roomNumber=2254))(&(!(nsRole=cn=another vaddr role,dc=example,dc=com))(!(roomNumber=3924))(!(roomNumber=4508))(l=sunnyvale)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(uid=user1))(!(uid=user20))(!(uid=user30)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(uid=user1))(!(uid=user20))(!(uid=user30)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(nsRole=cn=another vaddr role,dc=example,dc=com))(!(roomNumber=3924))(!(roomNumber=4508)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(nsRole=cn=another vaddr role,dc=example,dc=com))(!(roomNumber=3924))(!(roomNumber=4508)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(& (objectclass=inetorgperson)(!(uid=user1))(!(uid=user20))(!(uid=user30)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(| (objectclass=inetorgperson)(!(uid=user1))(!(uid=user20))(!(uid=user30)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(l=sunnyvale)(!(nsRole=cn=another vaddr role,dc=example,dc=com))(!(roomNumber=3924))(!(roomNumber=4508)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(l=sunnyvale)(!(nsRole=cn=another vaddr role,dc=example,dc=com))(!(roomNumber=3924))(!(roomNumber=4508)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(!(|(!(l=*))(!(l=sunnyvale))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(l=*))(!(l=sunnyvale)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(l=Cupertino) (emailclass=*emai*) (|(nsRole=cn=*vaddr*) (l=*)))] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(l=Cupertino))(!(emailclass=*emai*))(!(|(nsRole=cn=*vaddr*) (l=*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(l=Cupertino))(!(emailclass=*emai*))(|(nsRole=cn=*vaddr*) (l=*)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(l=Cupertino))(emailclass=*emai*) (|(nsRole=cn=*vaddr*) (l=*)))] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(l=Cupertino)(!(emailclass=*emai*))(|(nsRole=cn=*vaddr*) (l=*)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(l=Cupertino) (emailclass=*emai*) (|(nsRole=cn=*vaddr*) (l=*)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(l=Cupertino))(!(emailclass=*emai*))(!(|(nsRole=cn=*vaddr*) (l=*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(l=Cupertino))(!(emailclass=*emai*))(|(nsRole=cn=*vaddr*) (l=*)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(l=Cupertino))(emailclass=*emai*)(!(|(nsRole=cn=*vaddr*) (l=*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(l=Cupertino))(emailclass=*emai*) (|(nsRole=cn=*vaddr*) (l=*)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(l=Cupertino)(!(emailclass=*emai*))(!(|(nsRole=cn=*vaddr*) (l=*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(l=Cupertino)(!(emailclass=*emai*))(|(nsRole=cn=*vaddr*) (l=*)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(l=Cupertino) (emailclass=*emai*)(!(|(nsRole=cn=*vaddr*) (l=*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(postalCode:de:==77777)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(fred=*)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(emailclass:2.16.840.1.113730.3.3.2.15.1.5:=vpemail)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(emailclass:2.16.840.1.113730.3.3.2.15.1:=<1)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(emailclass:es:=<1)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(emailclass:2.16.840.1.113730.3.3.2.15.1.1:=1)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(emailclass:2.16.840.1.113730.3.3.2.15.1:=<vpemail)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(emailclass:es:=<vpemail)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(emailclass:2.16.840.1.113730.3.3.2.15.1.1:=vpemail)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:2.16.840.1.113730.3.3.2.15.1:=<900)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:es:=<900)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota<=100)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota>=900)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(nsRole~=cn=new managed)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(nsRole=cn=new vaddr filtered role,dc=example,dc=com)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(nsRole=cn=*another*)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(nsRole=cn=*vaddr*)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(uid=user20)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(uid=user30)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(nsRole=cn=another vaddr role,dc=example,dc=com)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(roomNumber=4508)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(uid=user40)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(roomNumber=2254)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:2.16.840.1.113730.3.3.2.18.1:=<=100)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:fr:=<=100)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:2.16.840.1.113730.3.3.2.18.1.2:=100)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:2.16.840.1.113730.3.3.2.18.1:=<=600)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:fr:=<=600)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:2.16.840.1.113730.3.3.2.18.1:=<=900)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:fr:=<=900)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:2.16.840.1.113730.3.3.2.18.1.4:=900)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(Description=This is the special \2a attribute value)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(Description=*\2a*)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(&(nsRole~=cn=new managed) (nsRole=cn=new vaddr filtered role,dc=example,dc=com))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(&(!(nsRole~=cn=new managed)) (nsRole=cn=new vaddr filtered role,dc=example,dc=com))] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(&(nsRole~=cn=new managed)(!(nsRole=cn=new vaddr filtered role,dc=example,dc=com)))] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(&(uid=*wal*) (nsRole=cn=*vaddr*))] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(&(nsRole=cn=*vaddr*) (uid=*wal*))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(&(!(l=Cupertino))(emailclass=*emai*)(!(|(nsRole=cn=*vaddr*) (l=*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(&(l=Cupertino)(!(emailclass=*emai*))(!(|(nsRole=cn=*vaddr*) (l=*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(&(l=Cupertino) (emailclass=*emai*)(!(|(nsRole=cn=*vaddr*) (l=*))))] 0.00
No log output captured.
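The two Description parameters above rely on the \2a escape from RFC 4515: an unescaped * in an assertion value is a wildcard, so matching a literal asterisk requires its hex escape. python-ldap's helper produces it:

from ldap.filter import escape_filter_chars

# '*' escapes to '\2a'; '(description=*\2a*)' is therefore a substring
# search for values containing a literal asterisk.
assert escape_filter_chars('*') == r'\2a'
flt = '(description=*%s*)' % escape_filter_chars('*')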
Passed suites/get_effective_rights/acceptance_test.py::test_group_aci_entry_exists 0.27
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  tests.suites.get_effective_rights.acceptance_test:acceptance_test.py:30 Adding user testuser
-------------------------------Captured log call--------------------------------
INFO  tests.suites.get_effective_rights.acceptance_test:acceptance_test.py:57 Adding group group1
INFO  tests.suites.get_effective_rights.acceptance_test:acceptance_test.py:70 Add an ACI granting add access to a user matching the groupdn
INFO  lib389:acceptance_test.py:79 dn: uid=testuser,dc=example,dc=com
INFO  lib389:acceptance_test.py:81 ######## entryLevelRights: b'vadn'
Passed suites/get_effective_rights/acceptance_test.py::test_group_aci_template_entry 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.get_effective_rights.acceptance_test:acceptance_test.py:105 Add an ACI granting add access to a user matching the userdn
INFO  lib389:acceptance_test.py:115 dn: cn=template_person_objectclass,dc=example,dc=com
INFO  lib389:acceptance_test.py:117 ######## entryLevelRights: b'vadn'
INFO  lib389:acceptance_test.py:120 dn: cn=template_groupofnames_objectclass,dc=example,dc=com
INFO  lib389:acceptance_test.py:122 ######## entryLevelRights: b'v'
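Note: the logs above show the suite adding an ACI and then reading entryLevelRights back for the requesting identity. A minimal sketch of adding such a groupdn-based ACI with lib389 (the standalone handle and the exact ACI text are illustrative):

    from lib389.idm.domain import Domain

    suffix = Domain(standalone, 'dc=example,dc=com')  # standalone: fixture handle
    ACI = ('(targetattr="*")(version 3.0; acl "grant add to group1"; '
           'allow (add) groupdn="ldap:///cn=group1,dc=example,dc=com";)')
    suffix.add('aci', ACI)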
Passed suites/gssapi/simple_gssapi_test.py::test_gssapi_bind 0.27
-----------------------------Captured stdout setup------------------------------
Kerberos master password: 4TP84zJXda6IVA.vKbJKU4bZHpT6.ImseRKRNR4OK.OqrDkRAnDE8X9yT1hW4Vwzr
Loading random data
Initializing database '/var/kerberos/krb5kdc/principal' for realm 'HOSTED.UPSHIFT.RDU2.REDHAT.COM', master key name 'K/M@HOSTED.UPSHIFT.RDU2.REDHAT.COM'
Authenticating as principal root/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM with password.
Principal "ldap/ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM" created.
Authenticating as principal root/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM with password.
K/M@HOSTED.UPSHIFT.RDU2.REDHAT.COM
kadmin/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM
kadmin/changepw@HOSTED.UPSHIFT.RDU2.REDHAT.COM
kadmin/ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM
kiprop/ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM
krbtgt/HOSTED.UPSHIFT.RDU2.REDHAT.COM@HOSTED.UPSHIFT.RDU2.REDHAT.COM
ldap/ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM
Authenticating as principal root/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM with password.
Entry for principal ldap/ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/etc/krb5.keytab.
Entry for principal ldap/ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/etc/krb5.keytab.
Authenticating as principal root/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM with password.
Principal "testuser@HOSTED.UPSHIFT.RDU2.REDHAT.COM" created.
Authenticating as principal root/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM with password.
K/M@HOSTED.UPSHIFT.RDU2.REDHAT.COM
kadmin/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM
kadmin/changepw@HOSTED.UPSHIFT.RDU2.REDHAT.COM
kadmin/ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM
kiprop/ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM
krbtgt/HOSTED.UPSHIFT.RDU2.REDHAT.COM@HOSTED.UPSHIFT.RDU2.REDHAT.COM
ldap/ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM
testuser@HOSTED.UPSHIFT.RDU2.REDHAT.COM
Authenticating as principal root/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM with password.
Entry for principal testuser@HOSTED.UPSHIFT.RDU2.REDHAT.COM with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/testuser.keytab.
Entry for principal testuser@HOSTED.UPSHIFT.RDU2.REDHAT.COM with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/testuser.keytab.
-----------------------------Captured stderr setup------------------------------
No policy specified for ldap/ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM; defaulting to no policy
No policy specified for testuser@HOSTED.UPSHIFT.RDU2.REDHAT.COM; defaulting to no policy
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
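Note: test_gssapi_bind drives a SASL/GSSAPI bind against the KDC provisioned above. A minimal python-ldap sketch of the same bind; it assumes a TGT is already in the credential cache (e.g. from kinit with the testuser keytab created above), and the URI is illustrative:

    import ldap
    import ldap.sasl

    conn = ldap.initialize('ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:38901')
    conn.sasl_interactive_bind_s('', ldap.sasl.gssapi())
    print(conn.whoami_s())  # the mapped identity, e.g. dn: uid=testuser,...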
Passed suites/gssapi/simple_gssapi_test.py::test_invalid_sasl_map 0.53
No log output captured.
Passed suites/gssapi/simple_gssapi_test.py::test_missing_user 1.57
------------------------------Captured stdout call------------------------------
Authenticating as principal testuser/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM with password.
Principal "doesnotexist@HOSTED.UPSHIFT.RDU2.REDHAT.COM" created.
Authenticating as principal testuser/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM with password.
K/M@HOSTED.UPSHIFT.RDU2.REDHAT.COM
doesnotexist@HOSTED.UPSHIFT.RDU2.REDHAT.COM
kadmin/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM
kadmin/changepw@HOSTED.UPSHIFT.RDU2.REDHAT.COM
kadmin/ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM
kiprop/ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM
krbtgt/HOSTED.UPSHIFT.RDU2.REDHAT.COM@HOSTED.UPSHIFT.RDU2.REDHAT.COM
ldap/ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM
testuser@HOSTED.UPSHIFT.RDU2.REDHAT.COM
Authenticating as principal testuser/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM with password.
Entry for principal doesnotexist@HOSTED.UPSHIFT.RDU2.REDHAT.COM with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/doesnotexist.keytab.
Entry for principal doesnotexist@HOSTED.UPSHIFT.RDU2.REDHAT.COM with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/doesnotexist.keytab.
------------------------------Captured stderr call------------------------------
No policy specified for doesnotexist@HOSTED.UPSHIFT.RDU2.REDHAT.COM; defaulting to no policy
Passed suites/gssapi/simple_gssapi_test.py::test_support_mech 0.30
No log output captured.
Passed suites/gssapi/simple_gssapi_test.py::test_rejected_mech 0.31
No log output captured.
Passed suites/import/import_test.py::test_import_with_index 6.83
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
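Note: the import suite exercises ldif2db both online and offline. A sketch of the online path, requesting an import through cn=tasks with lib389's Tasks API (the module that exports TASK_WAIT and the LDIF path are assumptions):

    from lib389.tasks import Tasks
    from lib389.properties import TASK_WAIT  # assumed location of TASK_WAIT

    tasks = Tasks(standalone)                # standalone: fixture handle
    tasks.importLDIF(suffix='dc=example,dc=com',
                     input_file='/var/lib/dirsrv/slapd-standalone1/ldif/data.ldif',
                     args={TASK_WAIT: True})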
Passed suites/import/import_test.py::test_crash_on_ldif2db 6.22
No log output captured.
Passed suites/import/import_test.py::test_ldif2db_allows_entries_without_a_parent_to_be_imported 5.62
-------------------------------Captured log call--------------------------------
ERROR  lib389:__init__.py:2706 ldif2db: Can't find file: /var/lib/dirsrv/slapd-standalone1/ldif/bogus.ldif
Passed suites/import/import_test.py::test_issue_a_warning_if_the_cache_size_is_smaller 7.11
No log output captured.
Passed suites/import/import_test.py::test_fast_slow_import 24.09
No log output captured.
Passed suites/import/import_test.py::test_entry_with_escaped_characters_fails_to_import_and_index 14.05
------------------------------Captured stderr call------------------------------
[20/Oct/2020:19:56:55.025349384 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000
[20/Oct/2020:19:56:55.034638809 -0400] - INFO - check_and_set_import_cache - pagesize: 4096, available bytes 7462834176, process usage 23007232
[20/Oct/2020:19:56:55.041498832 -0400] - INFO - check_and_set_import_cache - Import allocates 2915169KB import cache.
[20/Oct/2020:19:56:55.045696813 -0400] - INFO - bdb_copy_directory - Backing up file 0 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/givenName.db)
[20/Oct/2020:19:56:55.051901927 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/givenName.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/givenName.db
[20/Oct/2020:19:56:55.055568924 -0400] - INFO - bdb_copy_directory - Backing up file 1 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/aci.db)
[20/Oct/2020:19:56:55.059140666 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/aci.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/aci.db
[20/Oct/2020:19:56:55.062952617 -0400] - INFO - bdb_copy_directory - Backing up file 2 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/sn.db)
[20/Oct/2020:19:56:55.066449093 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/sn.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/sn.db
[20/Oct/2020:19:56:55.069691933 -0400] - INFO - bdb_copy_directory - Backing up file 3 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/numsubordinates.db)
[20/Oct/2020:19:56:55.073367069 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/numsubordinates.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/numsubordinates.db
[20/Oct/2020:19:56:55.076836303 -0400] - INFO - bdb_copy_directory - Backing up file 4 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/nsuniqueid.db)
[20/Oct/2020:19:56:55.085117927 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/nsuniqueid.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/nsuniqueid.db
[20/Oct/2020:19:56:55.088589730 -0400] - INFO - bdb_copy_directory - Backing up file 5 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/ancestorid.db)
[20/Oct/2020:19:56:55.091704298 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/ancestorid.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/ancestorid.db
[20/Oct/2020:19:56:55.094626913 -0400] - INFO - bdb_copy_directory - Backing up file 6 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/parentid.db)
[20/Oct/2020:19:56:55.097991669 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/parentid.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/parentid.db
[20/Oct/2020:19:56:55.104814952 -0400] - INFO - bdb_copy_directory - Backing up file 7 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/mail.db)
[20/Oct/2020:19:56:55.108905697 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/mail.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/mail.db
[20/Oct/2020:19:56:55.112637648 -0400] - INFO - bdb_copy_directory - Backing up file 8 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/telephoneNumber.db)
[20/Oct/2020:19:56:55.116120714 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/telephoneNumber.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/telephoneNumber.db
[20/Oct/2020:19:56:55.120703982 -0400] - INFO - bdb_copy_directory - Backing up file 9 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/cn.db)
[20/Oct/2020:19:56:55.124095482 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/cn.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/cn.db
[20/Oct/2020:19:56:55.127917388 -0400] - INFO - bdb_copy_directory - Backing up file 10 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/roomNumber.db)
[20/Oct/2020:19:56:55.132701467 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/roomNumber.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/roomNumber.db
[20/Oct/2020:19:56:55.136143274 -0400] - INFO - bdb_copy_directory - Backing up file 11 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/DBVERSION)
[20/Oct/2020:19:56:55.139309250 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/DBVERSION to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/DBVERSION
[20/Oct/2020:19:56:55.142716307 -0400] - INFO - bdb_copy_directory - Backing up file 12 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/uid.db)
[20/Oct/2020:19:56:55.146054618 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/uid.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/uid.db
[20/Oct/2020:19:56:55.149359133 -0400] - INFO - bdb_copy_directory - Backing up file 13 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/entryrdn.db)
[20/Oct/2020:19:56:55.152326114 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/entryrdn.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/entryrdn.db
[20/Oct/2020:19:56:55.155651578 -0400] - INFO - bdb_copy_directory - Backing up file 14 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/id2entry.db)
[20/Oct/2020:19:56:55.159481764 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/id2entry.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/id2entry.db
[20/Oct/2020:19:56:55.162923141 -0400] - INFO - bdb_copy_directory - Backing up file 15 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/objectclass.db)
[20/Oct/2020:19:56:55.166001095 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/objectclass.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/userRoot/objectclass.db
[20/Oct/2020:19:56:55.170051323 -0400] - INFO - upgradedb_core - userRoot: Start upgradedb.
[20/Oct/2020:19:56:55.173551833 -0400] - INFO - check_and_set_import_cache - pagesize: 4096, available bytes 7462494208, process usage 23990272
[20/Oct/2020:19:56:55.178154566 -0400] - INFO - check_and_set_import_cache - Import allocates 2915036KB import cache.
[20/Oct/2020:19:56:55.810723111 -0400] - INFO - bdb_import_main - reindex userRoot: Index buffering enabled with bucket size 100
[20/Oct/2020:19:56:56.517003785 -0400] - INFO - import_monitor_threads - reindex userRoot: Workers finished; cleaning up...
[20/Oct/2020:19:56:56.721969138 -0400] - INFO - import_monitor_threads - reindex userRoot: Workers cleaned up.
[20/Oct/2020:19:56:56.725849701 -0400] - INFO - bdb_import_main - reindex userRoot: Cleaning up producer thread...
[20/Oct/2020:19:56:56.729286214 -0400] - INFO - bdb_import_main - reindex userRoot: Indexing complete. Post-processing...
[20/Oct/2020:19:56:56.732840548 -0400] - INFO - bdb_import_main - reindex userRoot: Generating numsubordinates (this may take several minutes to complete)...
[20/Oct/2020:19:56:56.736021171 -0400] - INFO - bdb_import_main - reindex userRoot: Generating numSubordinates complete.
[20/Oct/2020:19:56:56.738901488 -0400] - INFO - bdb_get_nonleaf_ids - reindex userRoot: Gathering ancestorid non-leaf IDs...
[20/Oct/2020:19:56:56.742705443 -0400] - INFO - bdb_get_nonleaf_ids - reindex userRoot: Finished gathering ancestorid non-leaf IDs.
[20/Oct/2020:19:56:56.745470961 -0400] - INFO - ldbm_get_nonleaf_ids - reindex userRoot: Starting sort of ancestorid non-leaf IDs...
[20/Oct/2020:19:56:56.748141274 -0400] - INFO - ldbm_get_nonleaf_ids - reindex userRoot: Finished sort of ancestorid non-leaf IDs.
[20/Oct/2020:19:56:56.755103745 -0400] - INFO - bdb_ancestorid_new_idl_create_index - reindex userRoot: Creating ancestorid index (new idl)...
[20/Oct/2020:19:56:56.758569338 -0400] - INFO - bdb_ancestorid_new_idl_create_index - reindex userRoot: Created ancestorid index (new idl).
[20/Oct/2020:19:56:56.761564622 -0400] - INFO - bdb_import_main - reindex userRoot: Flushing caches...
[20/Oct/2020:19:56:56.764344049 -0400] - INFO - bdb_import_main - reindex userRoot: Closing files...
[20/Oct/2020:19:56:56.874797351 -0400] - INFO - bdb_import_main - reindex userRoot: Reindexing complete. Processed 15 entries in 1 seconds. (15.00 entries/sec)
[20/Oct/2020:19:56:56.879512429 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/log.0000000001 to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/log.0000000001
[20/Oct/2020:19:56:56.896140483 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/DBVERSION to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-20T19:56:54.908150/DBVERSION
[20/Oct/2020:19:56:56.901097540 -0400] - INFO - bdb_pre_close - All database threads now stopped
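Note: the stderr above is an offline reindex run: the userRoot index files are backed up, the indexes are rebuilt, and numSubordinates and the ancestorid index are regenerated. The online equivalent can be requested through cn=tasks; a sketch using lib389's Tasks API (the argument names and TASK_WAIT's module are assumptions):

    from lib389.tasks import Tasks
    from lib389.properties import TASK_WAIT  # assumed location of TASK_WAIT

    tasks = Tasks(standalone)                # standalone: fixture handle
    tasks.reindex(suffix='dc=example,dc=com', attrname='uid',
                  args={TASK_WAIT: True})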
Passed suites/import/regression_test.py::test_replay_import_operation 32.43
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.import.regression_test:regression_test.py:94 Exporting LDIF online...
INFO  tests.suites.import.regression_test:regression_test.py:53 Run.
INFO  tests.suites.import.regression_test:regression_test.py:104 Importing LDIF online, should raise operation error.
INFO  tests.suites.import.regression_test:regression_test.py:115 Looping. Tried 1 times so far.
INFO  tests.suites.import.regression_test:regression_test.py:115 Looping. Tried 2 times so far.
INFO  tests.suites.import.regression_test:regression_test.py:115 Looping. Tried 3 times so far.
INFO  tests.suites.import.regression_test:regression_test.py:115 Looping. Tried 4 times so far.
INFO  tests.suites.import.regression_test:regression_test.py:115 Looping. Tried 5 times so far.
INFO  tests.suites.import.regression_test:regression_test.py:115 Looping. Tried 6 times so far.
INFO  tests.suites.import.regression_test:regression_test.py:115 Looping. Tried 7 times so far.
INFO  tests.suites.import.regression_test:regression_test.py:57 Adding users.
INFO  tests.suites.import.regression_test:regression_test.py:119 Importing LDIF online
Passed suites/import/regression_test.py::test_import_be_default 11.33
-------------------------------Captured log call--------------------------------
INFO  tests.suites.import.regression_test:regression_test.py:143 Adding suffix:dc=default,dc=com and backend: default...
INFO  tests.suites.import.regression_test:regression_test.py:149 Create LDIF file and import it...
INFO  tests.suites.import.regression_test:regression_test.py:154 Stopping the server and running offline import...
INFO  tests.suites.import.regression_test:regression_test.py:160 Verifying entry count after import...
INFO  tests.suites.import.regression_test:regression_test.py:166 Test PASSED
Passed suites/import/regression_test.py::test_del_suffix_import 6.67
-------------------------------Captured log call--------------------------------
INFO  tests.suites.import.regression_test:regression_test.py:183 Adding suffix:dc=importest1,dc=com and backend: importest1
INFO  tests.suites.import.regression_test:regression_test.py:188 Create LDIF file and import it
INFO  tests.suites.import.regression_test:regression_test.py:194 Stopping the server and running offline import
INFO  tests.suites.import.regression_test:regression_test.py:199 Deleting suffix-dc=importest2,dc=com
INFO  tests.suites.import.regression_test:regression_test.py:202 Adding the same database-importest1 after deleting it
Passed suites/import/regression_test.py::test_del_suffix_backend 6.92
-------------------------------Captured log call--------------------------------
INFO  tests.suites.import.regression_test:regression_test.py:221 Adding suffix:dc=importest2,dc=com and backend: importest2
INFO  tests.suites.import.regression_test:regression_test.py:226 Create LDIF file and import it
INFO  lib389:tasks.py:498 Import task import_10202020_195759 for file /var/lib/dirsrv/slapd-standalone1/ldif/suffix_del2.ldif completed successfully
INFO  tests.suites.import.regression_test:regression_test.py:234 Deleting suffix-dc=importest2,dc=com
INFO  tests.suites.import.regression_test:regression_test.py:237 Adding the same database-importest2 after deleting it
INFO  tests.suites.import.regression_test:regression_test.py:240 Checking if server can be restarted after re-adding the same database
Passed suites/import/regression_test.py::test_import_duplicate_dn 15.19
-------------------------------Captured log call--------------------------------
INFO  tests.suites.import.regression_test:regression_test.py:266 Delete the previous error logs
INFO  tests.suites.import.regression_test:regression_test.py:269 Create import file
INFO  tests.suites.import.regression_test:regression_test.py:292 Import ldif with duplicate entry
ERROR  lib389:tasks.py:495 Error: import task import_10202020_195808 for file /var/lib/dirsrv/slapd-standalone1/ldif/data.ldif exited with -23
INFO  tests.suites.import.regression_test:regression_test.py:295 Restart the server to flush the logs
INFO  tests.suites.import.regression_test:regression_test.py:298 Error log should not have "unable to flush" message
INFO  tests.suites.import.regression_test:regression_test.py:301 Error log should have "Duplicated DN detected" message
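Note: after the import task exits with -23, the suite restarts the server and greps the error log. A minimal sketch of those checks with lib389's log helper:

    # standalone: fixture handle for the instance under test
    standalone.restart()  # flush and reopen the logs first
    assert not standalone.ds_error_log.match('.*unable to flush.*')
    assert standalone.ds_error_log.match('.*Duplicated DN detected.*')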
Passed suites/import/regression_test.py::test_large_ldif2db_ancestorid_index_creation 625.98
-------------------------------Captured log call--------------------------------
INFO  tests.suites.import.regression_test:regression_test.py:354 Delete the previous errors logs
INFO  tests.suites.import.regression_test:regression_test.py:357 Add suffix:o=test and backend: test...
INFO  tests.suites.import.regression_test:regression_test.py:370 Create a large nested ldif file using dbgen : /var/lib/dirsrv/slapd-standalone1/ldif/large_nested.ldif
INFO  tests.suites.import.regression_test:regression_test.py:373 Stop the server and run offline import...
INFO  tests.suites.import.regression_test:regression_test.py:378 Starting the server
INFO  tests.suites.import.regression_test:regression_test.py:381 parse the errors logs to check lines with "Starting sort of ancestorid" are present
INFO  tests.suites.import.regression_test:regression_test.py:385 parse the errors logs to check lines with "Finished sort of ancestorid" are present
INFO  tests.suites.import.regression_test:regression_test.py:389 parse the error logs for the line with "Gathering ancestorid non-leaf IDs"
INFO  tests.suites.import.regression_test:regression_test.py:393 parse the error logs for the line with "Created ancestorid index"
INFO  tests.suites.import.regression_test:regression_test.py:397 get the ancestorid non-leaf IDs indexing start and end time from the collected strings
INFO  tests.suites.import.regression_test:regression_test.py:403 Calculate the elapsed time for the ancestorid non-leaf IDs index creation
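Note: the last two steps pull the "Gathering/Created ancestorid" lines out of the error log and subtract their timestamps. A sketch of parsing the 389-ds error-log timestamp format; the nanosecond field is truncated to microseconds so strptime can handle it, and the two line variables are illustrative:

    from datetime import datetime

    def log_ts(line):
        # line starts like: [20/Oct/2020:19:56:55.025349384 -0400] - INFO - ...
        stamp = line[1:line.index(']')]
        date, rest = stamp.split('.', 1)
        frac, tz = rest.split(' ')
        return datetime.strptime('%s.%s %s' % (date, frac[:6], tz),
                                 '%d/%b/%Y:%H:%M:%S.%f %z')

    # gathering_line / created_line: the matched error-log lines
    elapsed = log_ts(created_line) - log_ts(gathering_line)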
Passed suites/lib389/config_compare_test.py::test_config_compare 0.01
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38902, 'ldap-secureport': 63602, 'server-id': 'standalone2', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_mul_explicit_rdn 0.02
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_mul_derive_single_dn 0.26
No log output captured.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_mul_derive_mult_dn 0.01
No log output captured.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_sin_explicit_dn 0.01
No log output captured.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_sin_explicit_rdn 0.26
No log output captured.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_sin_derive_single_dn 0.01
No log output captured.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_sin_derive_mult_dn 0.01
No log output captured.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_sin_invalid_no_basedn 0.00
No log output captured.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_sin_invalid_no_rdn 0.00
No log output captured.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_sin_non_present_rdn 0.02
No log output captured.
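Note: the dn_construct cases exercise the ways a lib389 DSLdapObject DN can be formed: an explicit dn, an explicit rdn under a basedn, or an RDN derived from the naming attribute in the create properties. A sketch of the derived-DN path (the property values are illustrative):

    from lib389.idm.user import UserAccounts

    users = UserAccounts(standalone, 'dc=example,dc=com')
    # The uid below becomes the RDN, so the entry lands at
    # uid=testuser,ou=People,dc=example,dc=com without an explicit dn.
    user = users.create(properties={
        'uid': 'testuser',
        'cn': 'testuser',
        'sn': 'user',
        'uidNumber': '1000',
        'gidNumber': '2000',
        'homeDirectory': '/home/testuser',
    })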
Passed suites/lib389/idm/user_compare_i2_test.py::test_user_compare_i2 1.29
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38902, 'ldap-secureport': 63602, 'server-id': 'standalone2', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/lib389/idm/user_compare_st_test.py::test_user_compare 0.07
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/logging/logging_config_test.py::test_logging_digit_config[logexpirationtime-invalid_vals0-valid_vals0] 0.59
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/logging/logging_config_test.py::test_logging_digit_config[maxlogsize-invalid_vals1-valid_vals1] 0.11
No log output captured.
Passed suites/logging/logging_config_test.py::test_logging_digit_config[logmaxdiskspace-invalid_vals2-valid_vals2] 0.09
No log output captured.
Passed suites/logging/logging_config_test.py::test_logging_digit_config[logminfreediskspace-invalid_vals3-valid_vals3] 0.09
No log output captured.
Passed suites/logging/logging_config_test.py::test_logging_digit_config[mode-invalid_vals4-valid_vals4] 0.38
No log output captured.
Passed suites/logging/logging_config_test.py::test_logging_digit_config[maxlogsperdir-invalid_vals5-valid_vals5] 0.10
No log output captured.
Passed suites/logging/logging_config_test.py::test_logging_digit_config[logrotationsynchour-invalid_vals6-valid_vals6] 0.09
No log output captured.
Passed suites/logging/logging_config_test.py::test_logging_digit_config[logrotationsyncmin-invalid_vals7-valid_vals7] 0.10
No log output captured.
Passed suites/logging/logging_config_test.py::test_logging_digit_config[logrotationtime-invalid_vals8-valid_vals8] 0.09
No log output captured.
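Note: each parametrized case above writes a set of invalid values to one cn=config logging attribute, expects the server to reject them, then confirms a valid value is accepted. A sketch against nsslapd-errorlog-logexpirationtime; which exact LDAP error the server raises is an assumption here, the suite only relies on the write being rejected:

    import ldap

    try:
        standalone.config.replace('nsslapd-errorlog-logexpirationtime', '-2')
        raise AssertionError('invalid value was accepted')
    except (ldap.UNWILLING_TO_PERFORM, ldap.OPERATIONS_ERROR):
        pass  # the server refused the invalid value, as expected
    standalone.config.replace('nsslapd-errorlog-logexpirationtime', '1')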
Passed suites/mapping_tree/be_del_and_default_naming_attr_test.py::test_be_delete 0.50
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
Passed suites/memberof_plugin/regression_test.py::test_memberof_group 4.91
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.memberof_plugin.regression_test:regression_test.py:480 Enable memberof plugin and set the scope as cn=sub1,dc=example,dc=com
INFO  lib389:regression_test.py:439 !!!!!!! uid=test_m1,cn=sub1,dc=example,dc=com: memberof->b'cn=g1,cn=sub1,dc=example,dc=com'
INFO  lib389:regression_test.py:439 !!!!!!! uid=test_m2,cn=sub1,dc=example,dc=com: memberof->b'cn=g1,cn=sub1,dc=example,dc=com'
INFO  lib389:regression_test.py:439 !!!!!!! uid=test_m1,cn=sub1,dc=example,dc=com: memberof->b'cn=g1,cn=sub1,dc=example,dc=com'
INFO  lib389:regression_test.py:439 !!!!!!! uid=test_m2,cn=sub1,dc=example,dc=com: memberof->b'cn=g1,cn=sub1,dc=example,dc=com'
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:426 Renaming user (cn=g2,cn=sub2,dc=example,dc=com): new cn=g2-new
INFO  lib389:regression_test.py:439 !!!!!!! uid=test_m1,cn=sub1,dc=example,dc=com: memberof->b'cn=g1,cn=sub1,dc=example,dc=com'
INFO  lib389:regression_test.py:439 !!!!!!! uid=test_m2,cn=sub1,dc=example,dc=com: memberof->b'cn=g1,cn=sub1,dc=example,dc=com'
INFO  lib389:regression_test.py:439 !!!!!!! uid=test_m1,cn=sub1,dc=example,dc=com: memberof->b'cn=g1,cn=sub1,dc=example,dc=com'
INFO  lib389:regression_test.py:439 !!!!!!! uid=test_m1,cn=sub1,dc=example,dc=com: memberof->b'cn=g2-new,cn=sub1,dc=example,dc=com'
INFO  lib389:regression_test.py:439 !!!!!!! uid=test_m2,cn=sub1,dc=example,dc=com: memberof->b'cn=g1,cn=sub1,dc=example,dc=com'
INFO  lib389:regression_test.py:439 !!!!!!! uid=test_m2,cn=sub1,dc=example,dc=com: memberof->b'cn=g2-new,cn=sub1,dc=example,dc=com'
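Note: test_memberof_group starts by enabling the MemberOf plugin and scoping it to cn=sub1, as the first log line says. A minimal sketch of that setup with lib389; the restart to activate the plugin follows the usual pattern:

    from lib389.plugins import MemberOfPlugin

    memberof = MemberOfPlugin(standalone)  # standalone: fixture handle
    memberof.enable()
    memberof.replace('memberOfEntryScope', 'cn=sub1,dc=example,dc=com')
    standalone.restart()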
Passed suites/memberof_plugin/regression_test.py::test_entrycache_on_modrdn_failure 9.05
-------------------------------Captured log call--------------------------------
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:574 Adding user (cn=user0,ou=people,dc=example,dc=com):
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:574 Adding user (cn=user1,ou=people,dc=example,dc=com):
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:574 Adding user (cn=user2,ou=people,dc=example,dc=com):
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:574 Adding user (cn=user3,ou=people,dc=example,dc=com):
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:574 Adding user (cn=user4,ou=people,dc=example,dc=com):
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:574 Adding user (cn=user5,ou=people,dc=example,dc=com):
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:574 Adding user (cn=user6,ou=people,dc=example,dc=com):
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:574 Adding user (cn=user7,ou=people,dc=example,dc=com):
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:574 Adding user (cn=user8,ou=people,dc=example,dc=com):
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:574 Adding user (cn=user9,ou=people,dc=example,dc=com):
INFO  lib389:regression_test.py:595 !!!!!!! cn=user0,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=People,dc=example,dc=com' (vs b'cn=group_in0,ou=people,dc=example,dc=com')
INFO  lib389:regression_test.py:595 !!!!!!! cn=user1,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=People,dc=example,dc=com' (vs b'cn=group_in0,ou=people,dc=example,dc=com')
INFO  lib389:regression_test.py:617 !!!!!!! cn=user0,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=People,dc=example,dc=com' (vs b'cn=group_out1,dc=example,dc=com')
INFO  lib389:regression_test.py:617 !!!!!!! cn=user1,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=People,dc=example,dc=com' (vs b'cn=group_out1,dc=example,dc=com')
INFO  lib389:regression_test.py:632 !!!!!!! cn=user0,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=People,dc=example,dc=com' (vs b'cn=group_in1,ou=people,dc=example,dc=com')
INFO  lib389:regression_test.py:632 !!!!!!! cn=user0,ou=people,dc=example,dc=com: memberof->b'cn=group_in1,ou=People,dc=example,dc=com' (vs b'cn=group_in1,ou=people,dc=example,dc=com')
INFO  lib389:regression_test.py:632 !!!!!!! cn=user1,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=People,dc=example,dc=com' (vs b'cn=group_in1,ou=people,dc=example,dc=com')
INFO  lib389:regression_test.py:632 !!!!!!! cn=user1,ou=people,dc=example,dc=com: memberof->b'cn=group_in1,ou=People,dc=example,dc=com' (vs b'cn=group_in1,ou=people,dc=example,dc=com')
INFO  lib389:regression_test.py:671 retrieve: cn=group_in0,ou=People,dc=example,dc=com with desc=b'mygroup'
INFO  lib389:regression_test.py:671 retrieve: cn=group_in1,ou=People,dc=example,dc=com with desc=b'mygroup'
INFO  lib389:regression_test.py:671 retrieve: cn=group_out2,dc=example,dc=com with desc=b'this is to check that the entry having this description has the appropriate DN'
Passed suites/memberof_plugin/regression_test.py::test_silent_memberof_failure 9.14
-------------------------------Captured log call--------------------------------
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:737 Adding user (cn=user0,ou=people,dc=example,dc=com):
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:737 Adding user (cn=user1,ou=people,dc=example,dc=com):
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:737 Adding user (cn=user2,ou=people,dc=example,dc=com):
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:737 Adding user (cn=user3,ou=people,dc=example,dc=com):
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:737 Adding user (cn=user4,ou=people,dc=example,dc=com):
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:737 Adding user (cn=user5,ou=people,dc=example,dc=com):
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:737 Adding user (cn=user6,ou=people,dc=example,dc=com):
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:737 Adding user (cn=user7,ou=people,dc=example,dc=com):
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:737 Adding user (cn=user8,ou=people,dc=example,dc=com):
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:737 Adding user (cn=user9,ou=people,dc=example,dc=com):
INFO  lib389:regression_test.py:758 !!!!!!! cn=user0,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=People,dc=example,dc=com' (vs b'cn=group_in0,ou=people,dc=example,dc=com')
INFO  lib389:regression_test.py:758 !!!!!!! cn=user1,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=People,dc=example,dc=com' (vs b'cn=group_in0,ou=people,dc=example,dc=com')
INFO  lib389:regression_test.py:780 !!!!!!! cn=user0,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=People,dc=example,dc=com' (vs b'cn=group_out1,dc=example,dc=com')
INFO  lib389:regression_test.py:780 !!!!!!! cn=user1,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=People,dc=example,dc=com' (vs b'cn=group_out1,dc=example,dc=com')
INFO  lib389:regression_test.py:795 !!!!!!! cn=user0,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=People,dc=example,dc=com' (vs b'cn=group_in1,ou=people,dc=example,dc=com')
INFO  lib389:regression_test.py:795 !!!!!!! cn=user0,ou=people,dc=example,dc=com: memberof->b'cn=group_in1,ou=People,dc=example,dc=com' (vs b'cn=group_in1,ou=people,dc=example,dc=com')
INFO  lib389:regression_test.py:795 !!!!!!! cn=user1,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=People,dc=example,dc=com' (vs b'cn=group_in1,ou=people,dc=example,dc=com')
INFO  lib389:regression_test.py:795 !!!!!!! cn=user1,ou=people,dc=example,dc=com: memberof->b'cn=group_in1,ou=People,dc=example,dc=com' (vs b'cn=group_in1,ou=people,dc=example,dc=com')
INFO  lib389:regression_test.py:832 Should assert cn=user2,ou=people,dc=example,dc=com has memberof is False
INFO  lib389:regression_test.py:832 Should assert cn=user3,ou=people,dc=example,dc=com has memberof is False
INFO  lib389:regression_test.py:855 Should assert cn=user4,ou=people,dc=example,dc=com has memberof is False
INFO  lib389:regression_test.py:855 Should assert cn=user5,ou=people,dc=example,dc=com has memberof is False
Passed suites/monitor/monitor_test.py::test_monitor 0.37
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.monitor.monitor_test:monitor_test.py:44 connection: ['1:20201021001311Z:3:2:-:cn=directory manager:0:0:0:1:ip=127.0.0.1'], currentconnections: ['1'], totalconnections: ['1']
INFO  tests.suites.monitor.monitor_test:monitor_test.py:48 version :: ['1:20201021001311Z:6:5:-:cn=directory manager:0:0:0:1:ip=127.0.0.1']
INFO  tests.suites.monitor.monitor_test:monitor_test.py:52 threads: ['16'], currentconnectionsatmaxthreads: ['0'], maxthreadsperconnhits: ['0']
INFO  tests.suites.monitor.monitor_test:monitor_test.py:56 nbackends: ['1'], backendmonitordn: ['cn=monitor,cn=userRoot,cn=ldbm database,cn=plugins,cn=config']
INFO  tests.suites.monitor.monitor_test:monitor_test.py:60 opsinitiated: ['12'], opscompleted: ['13']
INFO  tests.suites.monitor.monitor_test:monitor_test.py:64 dtablesize: ['1024'], readwaiters: ['0'], entriessent: ['14'], bytessent: ['1100'], currenttime: ['20201021001311Z'], starttime: ['20201021001311Z']
Passed suites/monitor/monitor_test.py::test_monitor_ldbm 0.01
No log output captured.
Passed suites/monitor/monitor_test.py::test_monitor_backend 0.01
No log output captured.
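Note: the three monitor tests read cn=monitor and its ldbm and backend counterparts. A minimal sketch of fetching the same counters through lib389's monitor objects:

    from lib389.monitor import Monitor, MonitorLDBM

    monitor = Monitor(standalone)  # standalone: fixture handle
    print(monitor.get_attr_vals_utf8('connection'))
    print(monitor.get_attr_vals_utf8('currentconnections'))
    ldbm = MonitorLDBM(standalone)  # the cn=monitor,cn=ldbm database entry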
Passed suites/openldap_2_389/migrate_test.py::test_parse_openldap_slapdd 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.migrate.openldap.config:config.py:264 Examining OpenLDAP Configuration ...
INFO  lib389.migrate.openldap.config:config.py:285 Completed OpenLDAP Configuration Parsing.
Passed suites/openldap_2_389/migrate_test.py::test_migrate_openldap_slapdd 15.96
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stdout call------------------------------
==== migration plan ====
SchemaAttributeCreate -> ('pseudonym',)
SchemaAttributeCreate -> ('email', 'emailAddress', 'pkcs9email')
SchemaAttributeCreate -> ('textEncodedORAddress',)
SchemaAttributeUnsupported -> ('otherMailbox',)
SchemaAttributeCreate -> ('aRecord',)
SchemaAttributeCreate -> ('mDRecord',)
SchemaAttributeCreate -> ('mXRecord',)
SchemaAttributeCreate -> ('nSRecord',)
SchemaAttributeCreate -> ('sOARecord',)
SchemaAttributeCreate -> ('cNAMERecord',)
SchemaAttributeCreate -> ('janetMailbox',)
SchemaAttributeCreate -> ('mailPreferenceOption',)
SchemaAttributeUnsupported -> ('dSAQuality',)
SchemaAttributeUnsupported -> ('singleLevelQuality',)
SchemaAttributeUnsupported -> ('subtreeMinimumQuality',)
SchemaAttributeUnsupported -> ('subtreeMaximumQuality',)
SchemaAttributeCreate -> ('personalSignature',)
SchemaAttributeCreate -> ('suseDefaultBase',)
SchemaAttributeCreate -> ('suseNextUniqueId',)
SchemaAttributeCreate -> ('suseMinUniqueId',)
SchemaAttributeCreate -> ('suseMaxUniqueId',)
SchemaAttributeCreate -> ('suseDefaultTemplate',)
SchemaAttributeCreate -> ('suseSearchFilter',)
SchemaAttributeCreate -> ('suseDefaultValue',)
SchemaAttributeCreate -> ('suseNamingAttribute',)
SchemaAttributeCreate -> ('suseSecondaryGroup',)
SchemaAttributeCreate -> ('suseMinPasswordLength',)
SchemaAttributeCreate -> ('suseMaxPasswordLength',)
SchemaAttributeCreate -> ('susePasswordHash',)
SchemaAttributeCreate -> ('suseSkelDir',)
SchemaAttributeCreate -> ('susePlugin',)
SchemaAttributeCreate -> ('suseMapAttribute',)
SchemaAttributeCreate -> ('suseImapServer',)
SchemaAttributeCreate -> ('suseImapAdmin',)
SchemaAttributeCreate -> ('suseImapDefaultQuota',)
SchemaAttributeCreate -> ('suseImapUseSsl',)
SchemaClassCreate -> 1.3.6.1.4.1.250.3.15 ('labeledURIObject',) may -> ('labeledURI',) must -> () sup -> ('top',)
SchemaClassUnsupported -> 0.9.2342.19200300.100.4.4 ('pilotPerson', 'newPilotPerson') may -> ('userid', 'textEncodedORAddress', 'rfc822Mailbox', 'favouriteDrink', 'roomNumber', 'userClass', 'homeTelephoneNumber', 'homePostalAddress', 'secretary', 'personalTitle', 'preferredDeliveryMethod', 'businessCategory', 'janetMailbox', 'otherMailbox', 'mobileTelephoneNumber', 'pagerTelephoneNumber', 'organizationalStatus', 'mailPreferenceOption', 'personalSignature') must -> () sup -> ('person',)
SchemaClassCreate -> 0.9.2342.19200300.100.4.15 ('dNSDomain',) may -> ('ARecord', 'MDRecord', 'MXRecord', 'NSRecord', 'SOARecord', 'CNAMERecord') must -> () sup -> ('domain',)
SchemaClassCreate -> 0.9.2342.19200300.100.4.20 ('pilotOrganization',) may -> ('buildingName',) must -> () sup -> ('organization', 'organizationalUnit')
SchemaClassUnsupported -> 0.9.2342.19200300.100.4.21 ('pilotDSA',) may -> ('dSAQuality',) must -> () sup -> ('dsa',)
SchemaClassUnsupported -> 0.9.2342.19200300.100.4.22 ('qualityLabelledData',) may -> ('subtreeMinimumQuality', 'subtreeMaximumQuality') must -> ('dsaQuality',) sup -> ('top',)
SchemaClassCreate -> 1.3.6.1.4.1.5322.13.1.1 ('namedObject',) may -> ('cn',) must -> () sup -> ('top',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:2 ('suseModuleConfiguration',) may -> ('suseDefaultBase',) must -> ('cn',) sup -> ('top',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:3 ('suseUserConfiguration',) may -> ('suseMinPasswordLength', 'suseMaxPasswordLength', 'susePasswordHash', 'suseSkelDir', 'suseNextUniqueId', 'suseMinUniqueId', 'suseMaxUniqueId', 'suseDefaultTemplate', 'suseSearchFilter', 'suseMapAttribute') must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:4 ('suseObjectTemplate',) may -> ('susePlugin', 'suseDefaultValue', 'suseNamingAttribute') must -> ('cn',) sup -> ('top',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:5 ('suseUserTemplate',) may -> ('suseSecondaryGroup',) must -> ('cn',) sup -> ('suseObjectTemplate',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:6 ('suseGroupTemplate',) may -> () must -> ('cn',) sup -> ('suseObjectTemplate',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:7 ('suseGroupConfiguration',) may -> ('suseNextUniqueId', 'suseMinUniqueId', 'suseMaxUniqueId', 'suseDefaultTemplate', 'suseSearchFilter', 'suseMapAttribute') must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:8 ('suseCaConfiguration',) may -> () must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:9 ('suseDnsConfiguration',) may -> () must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:10 ('suseDhcpConfiguration',) may -> () must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:11 ('suseMailConfiguration',) may -> () must -> ('suseImapServer', 'suseImapAdmin', 'suseImapDefaultQuota', 'suseImapUseSsl') sup -> ('suseModuleConfiguration',)
DatabaseReindex -> dc=example,dc=com
PluginMemberOfEnable
PluginMemberOfScope -> dc=example,dc=com
PluginMemberOfFixup -> dc=example,dc=com
PluginRefintEnable
PluginRefintAttributes -> member
PluginRefintAttributes -> memberOf
PluginRefintScope -> dc=example,dc=com
PluginUniqueConfigure -> dc=example,dc=com, mail 401a528e-eaf5-1039-8667-dbfbf2f5e6dd
PluginUniqueConfigure -> dc=example,dc=com, uid 401a528e-eaf5-1039-8667-dbfbf2f5e6dd
DatabaseCreate -> dc=example,dc=net, 401a7084-eaf5-1039-866c-dbfbf2f5e6dd
DatabaseIndexCreate -> objectClass eq, dc=example,dc=net
DatabaseReindex -> dc=example,dc=net
PluginMemberOfEnable
PluginMemberOfScope -> dc=example,dc=net
PluginMemberOfFixup -> dc=example,dc=net
PluginUniqueConfigure -> dc=example,dc=net, mail 401a7084-eaf5-1039-866c-dbfbf2f5e6dd
PluginUniqueConfigure -> dc=example,dc=net, uid 401a7084-eaf5-1039-866c-dbfbf2f5e6dd
DatabaseLdifImport -> dc=example,dc=com /export/tests/suites/openldap_2_389/../../data/openldap_2_389/1/example_com.slapcat.ldif
DatabaseLdifImport -> dc=example,dc=net /export/tests/suites/openldap_2_389/../../data/openldap_2_389/1/example_net.slapcat.ldif
==== end migration plan ====
-------------------------------Captured log call--------------------------------
INFO  lib389.migrate.openldap.config:config.py:264 Examining OpenLDAP Configuration ...
INFO  lib389.migrate.openldap.config:config.py:285 Completed OpenLDAP Configuration Parsing.
INFO  lib389.migrate.plan:plan.py:656 migration: 1 / 73 complete ...
[... per-step migration progress lines 2-72 elided ...]
INFO  lib389.migrate.plan:plan.py:656 migration: 73 / 73 complete ...
INFO  lib389.migrate.plan:plan.py:663 post: 1 / 73 complete ...
[... per-step post progress lines 2-52 elided ...]
INFO  lib389.migrate.plan:plan.py:663 post: 53 / 73 complete ...
INFO  lib389:tasks.py:798 Index task index_all_10202020_201330 completed successfully
INFO  lib389.migrate.plan:plan.py:663 post: 54 / 73 complete ...
[... per-step post progress lines 55-64 elided ...]
INFO  lib389.migrate.plan:plan.py:663 post: 65 / 73 complete ...
INFO  lib389:tasks.py:798 Index task index_all_10202020_201332 completed successfully
INFO  lib389.migrate.plan:plan.py:663 post: 66 / 73 complete ...
[... per-step post progress lines 67-72 elided ...]
INFO  lib389.migrate.plan:plan.py:663 post: 73 / 73 complete ...
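Note: the migration is driven by lib389.migrate, the modules named in the log: parse the OpenLDAP olc tree, build the plan printed above, execute it, then run the post steps (reindexes, fixups, LDIF imports). A rough sketch of the moving parts; the olConfig and Migration class names match those modules, but the exact Migration constructor arguments are an assumption and may differ between versions:

    from lib389.migrate.openldap.config import olConfig
    from lib389.migrate.plan import Migration

    config = olConfig('/etc/openldap/slapd.d')  # parsed cn=config tree
    ldifs = {'dc=example,dc=com': '/path/to/example_com.slapcat.ldif'}
    # standalone: the target 389-ds instance handle (constructor shape assumed)
    migration = Migration(standalone, config.schema, config.databases, ldifs)
    migration.execute_plan()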
Passed suites/openldap_2_389/migrate_test.py::test_migrate_openldap_slapdd_skip_elements 9.96
------------------------------Captured stdout call------------------------------
==== migration plan ====
SchemaAttributeUnsupported -> ('otherMailbox',)
SchemaAttributeUnsupported -> ('dSAQuality',)
SchemaAttributeUnsupported -> ('singleLevelQuality',)
SchemaAttributeUnsupported -> ('subtreeMinimumQuality',)
SchemaAttributeUnsupported -> ('subtreeMaximumQuality',)
SchemaAttributeCreate -> ('suseDefaultBase',)
SchemaAttributeCreate -> ('suseNextUniqueId',)
SchemaAttributeCreate -> ('suseMinUniqueId',)
SchemaAttributeCreate -> ('suseMaxUniqueId',)
SchemaAttributeCreate -> ('suseDefaultTemplate',)
SchemaAttributeCreate -> ('suseSearchFilter',)
SchemaAttributeCreate -> ('suseDefaultValue',)
SchemaAttributeCreate -> ('suseNamingAttribute',)
SchemaAttributeCreate -> ('suseSecondaryGroup',)
SchemaAttributeCreate -> ('suseMinPasswordLength',)
SchemaAttributeCreate -> ('suseMaxPasswordLength',)
SchemaAttributeCreate -> ('susePasswordHash',)
SchemaAttributeCreate -> ('suseSkelDir',)
SchemaAttributeCreate -> ('susePlugin',)
SchemaAttributeCreate -> ('suseMapAttribute',)
SchemaAttributeCreate -> ('suseImapServer',)
SchemaAttributeCreate -> ('suseImapAdmin',)
SchemaAttributeCreate -> ('suseImapDefaultQuota',)
SchemaAttributeCreate -> ('suseImapUseSsl',)
SchemaClassUnsupported -> 0.9.2342.19200300.100.4.4 ('pilotPerson', 'newPilotPerson') may -> ('userid', 'textEncodedORAddress', 'rfc822Mailbox', 'favouriteDrink', 'roomNumber', 'userClass', 'homeTelephoneNumber', 'homePostalAddress', 'secretary', 'personalTitle', 'preferredDeliveryMethod', 'businessCategory', 'janetMailbox', 'otherMailbox', 'mobileTelephoneNumber', 'pagerTelephoneNumber', 'organizationalStatus', 'mailPreferenceOption', 'personalSignature') must -> () sup -> ('person',)
SchemaClassInconsistent -> ( 0.9.2342.19200300.100.4.20 NAME 'pilotOrganization' SUP organization STRUCTURAL MAY buildingName X-ORIGIN 'user defined' ) to 0.9.2342.19200300.100.4.20 ('pilotOrganization',) may -> ('buildingName',) must -> () sup -> ('organization', 'organizationalUnit')
SchemaClassUnsupported -> 0.9.2342.19200300.100.4.21 ('pilotDSA',) may -> ('dSAQuality',) must -> () sup -> ('dsa',)
SchemaClassUnsupported -> 0.9.2342.19200300.100.4.22 ('qualityLabelledData',) may -> ('subtreeMinimumQuality', 'subtreeMaximumQuality') must -> ('dsaQuality',) sup -> ('top',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:2 ('suseModuleConfiguration',) may -> ('suseDefaultBase',) must -> ('cn',) sup -> ('top',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:3 ('suseUserConfiguration',) may -> ('suseMinPasswordLength', 'suseMaxPasswordLength', 'susePasswordHash', 'suseSkelDir', 'suseNextUniqueId', 'suseMinUniqueId', 'suseMaxUniqueId', 'suseDefaultTemplate', 'suseSearchFilter', 'suseMapAttribute') must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:4 ('suseObjectTemplate',) may -> ('susePlugin', 'suseDefaultValue', 'suseNamingAttribute') must -> ('cn',) sup -> ('top',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:5 ('suseUserTemplate',) may -> ('suseSecondaryGroup',) must -> ('cn',) sup -> ('suseObjectTemplate',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:6 ('suseGroupTemplate',) may -> () must -> ('cn',) sup -> ('suseObjectTemplate',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:7 ('suseGroupConfiguration',) may -> ('suseNextUniqueId', 'suseMinUniqueId', 'suseMaxUniqueId', 'suseDefaultTemplate', 'suseSearchFilter', 'suseMapAttribute') must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:8 ('suseCaConfiguration',) may -> () must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:9 ('suseDnsConfiguration',) may -> () must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:10 ('suseDhcpConfiguration',) may -> () must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:11 ('suseMailConfiguration',) may -> () must -> ('suseImapServer', 'suseImapAdmin', 'suseImapDefaultQuota', 'suseImapUseSsl') sup -> ('suseModuleConfiguration',)
DatabaseReindex -> dc=example,dc=com
PluginMemberOfEnable
PluginMemberOfScope -> dc=example,dc=com
PluginMemberOfFixup -> dc=example,dc=com
PluginRefintEnable
PluginRefintAttributes -> member
PluginRefintAttributes -> memberOf
PluginRefintScope -> dc=example,dc=com
PluginUniqueConfigure -> dc=example,dc=com, mail 401a528e-eaf5-1039-8667-dbfbf2f5e6dd
PluginUniqueConfigure -> dc=example,dc=com, uid 401a528e-eaf5-1039-8667-dbfbf2f5e6dd
DatabaseReindex -> dc=example,dc=net
PluginMemberOfEnable
PluginMemberOfScope -> dc=example,dc=net
PluginMemberOfFixup -> dc=example,dc=net
PluginUniqueConfigure -> dc=example,dc=net, mail 401a7084-eaf5-1039-866c-dbfbf2f5e6dd
PluginUniqueConfigure -> dc=example,dc=net, uid 401a7084-eaf5-1039-866c-dbfbf2f5e6dd
DatabaseLdifImport -> dc=example,dc=com /export/tests/suites/openldap_2_389/../../data/openldap_2_389/1/example_com.slapcat.ldif
==== end migration plan ====
-------------------------------Captured log call--------------------------------
INFO  lib389.migrate.openldap.config:config.py:264 Examining OpenLDAP Configuration ... INFO  lib389.migrate.openldap.config:config.py:285 Completed OpenLDAP Configuration Parsing. INFO  lib389.migrate.plan:plan.py:656 migration: 1 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 2 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 3 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 4 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 5 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 6 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 7 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 8 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 9 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 10 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 11 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 12 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 13 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 14 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 15 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 16 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 17 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 18 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 19 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 20 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 21 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 22 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 23 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 24 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 25 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 26 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 27 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 28 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 29 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 30 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 31 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 32 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 33 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 34 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 35 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 36 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 37 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 38 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 39 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 40 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 41 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 42 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 43 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 44 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 45 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 46 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 47 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 48 / 55 complete ... 
INFO  lib389.migrate.plan:plan.py:656 migration: 49 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 50 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 51 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 52 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 53 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 54 / 55 complete ... INFO  lib389.migrate.plan:plan.py:656 migration: 55 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 1 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 2 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 3 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 4 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 5 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 6 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 7 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 8 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 9 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 10 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 11 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 12 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 13 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 14 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 15 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 16 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 17 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 18 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 19 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 20 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 21 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 22 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 23 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 24 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 25 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 26 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 27 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 28 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 29 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 30 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 31 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 32 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 33 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 34 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 35 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 36 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 37 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 38 / 55 complete ... INFO  lib389:tasks.py:798 Index task index_all_10202020_201340 completed successfully INFO  lib389.migrate.plan:plan.py:663 post: 39 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 40 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 41 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 42 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 43 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 44 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 45 / 55 complete ... 
INFO  lib389.migrate.plan:plan.py:663 post: 46 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 47 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 48 / 55 complete ... INFO  lib389:tasks.py:798 Index task index_all_10202020_201342 completed successfully INFO  lib389.migrate.plan:plan.py:663 post: 49 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 50 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 51 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 52 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 53 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 54 / 55 complete ... INFO  lib389.migrate.plan:plan.py:663 post: 55 / 55 complete ...
Passed suites/paged_results/paged_results_test.py::test_search_success[6-5] 1.20
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:56 Adding user simplepaged_test
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 5 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:247 Set user bind simplepaged_test INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 6; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f665abe0>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:253 5 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 5 users
Passed suites/paged_results/paged_results_test.py::test_search_success[5-5] 0.23
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 5 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:247 Set user bind simplepaged_test INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 5; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f6678a00>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:253 5 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 5 users
Passed suites/paged_results/paged_results_test.py::test_search_success[5-25] 1.05
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 25 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:247 Set user bind simplepaged_test INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 5; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f665a3a0>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:253 25 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 25 users
Passed suites/paged_results/paged_results_test.py::test_search_limits_fail[50-200-cn=config,cn=ldbm database,cn=plugins,cn=config-nsslapd-idlistscanlimit-100-UNWILLING_TO_PERFORM] 9.26
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 200 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to 100. Previous value - b'4000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:299 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:302 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:308 Initiate ldapsearch with created control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:320 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 200 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to b'4000'. Previous value - b'100'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config.
Passed suites/paged_results/paged_results_test.py::test_search_limits_fail[5-15-cn=config-nsslapd-timelimit-20-UNAVAILABLE_CRITICAL_EXTENSION] 30.95
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 15 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-timelimit to 20. Previous value - b'3600'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:299 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:302 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:308 Initiate ldapsearch with created control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:320 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 15 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-timelimit to b'3600'. Previous value - b'20'. Modified suffix - cn=config.
Passed suites/paged_results/paged_results_test.py::test_search_limits_fail[21-50-cn=config-nsslapd-sizelimit-20-SIZELIMIT_EXCEEDED] 2.23
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 50 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-sizelimit to 20. Previous value - b'2000'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:299 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:302 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:308 Initiate ldapsearch with created control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:320 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 50 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-sizelimit to b'2000'. Previous value - b'20'. Modified suffix - cn=config.
Passed suites/paged_results/paged_results_test.py::test_search_limits_fail[21-50-cn=config-nsslapd-pagedsizelimit-5-SIZELIMIT_EXCEEDED] 2.55
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 50 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to 5. Previous value - b'0'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:299 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:302 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:308 Initiate ldapsearch with created control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:320 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 50 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to b'0'. Previous value - b'5'. Modified suffix - cn=config.
Passed suites/paged_results/paged_results_test.py::test_search_limits_fail[5-50-cn=config,cn=ldbm database,cn=plugins,cn=config-nsslapd-lookthroughlimit-20-ADMINLIMIT_EXCEEDED] 2.30
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 50 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to 20. Previous value - b'5000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:299 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:302 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:308 Initiate ldapsearch with created control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:320 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 50 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to b'5000'. Previous value - b'20'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config.
Passed suites/paged_results/paged_results_test.py::test_search_sort_success 2.45
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 50 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:379 Initiate ldapsearch with created control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:380 Collect data with sorting INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 5; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f6519a30>, <lib389._controls.SSSRequestControl object at 0x7fa5f65190a0>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 6 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 7 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 8 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 9 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:385 Substring numbers from user DNs INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:389 Assert that list is sorted INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 50 users
Passed suites/paged_results/paged_results_test.py::test_search_abandon 5.42
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:420 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:423 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:427 Initiate a search with a paged results control INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:430 Abandon the search INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:433 Expect an ldap.TIMEOUT exception, while trying to get the search results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 10 users
Passed suites/paged_results/paged_results_test.py::test_search_with_timelimit 34.10
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 100 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:469 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:472 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:477 Iteration 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:484 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:484 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:500 Done with this search - sleeping 10 seconds INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:477 Iteration 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:484 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:484 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:500 Done with this search - sleeping 10 seconds INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:477 Iteration 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:484 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:484 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:500 Done with this search - sleeping 10 seconds INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 100 users
Passed suites/paged_results/paged_results_test.py::test_search_dns_ip_aci[fqdn] 4.63
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 100 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:548 Back up current suffix ACI INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:551 Add test ACI INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:557 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:560 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:564 Initiate three searches with a paged results control INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:566 1 search INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 5; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f6665700>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 6 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 7 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 8 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 9 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 10 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 11 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 12 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 13 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 14 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 15 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 16 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 17 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 18 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 19 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:569 100 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:566 2 search INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 5; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f6665700>]. 
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 6 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 7 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 8 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 9 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 10 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 11 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 12 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 13 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 14 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 15 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 16 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 17 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 18 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 19 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:569 100 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:566 3 search INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 5; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f6665700>]. 
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 6 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 7 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 8 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 9 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 10 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 11 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 12 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 13 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 14 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 15 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 16 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 17 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 18 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 19 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:569 100 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:571 If we are here, then no error has happened. We are good. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:574 Restore ACI INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 100 users
Passed suites/paged_results/paged_results_test.py::test_search_dns_ip_aci[ip] 4.20
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 100 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:548 Back up current suffix ACI INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:551 Add test ACI INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:557 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:560 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:564 Initiate three searches with a paged results control INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:566 1 search INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 5; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f66a2b80>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 6 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 7 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 8 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 9 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 10 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 11 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 12 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 13 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 14 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 15 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 16 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 17 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 18 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 19 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:569 100 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:566 2 search INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 5; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f66a2b80>]. 
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 6 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 7 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 8 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 9 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 10 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 11 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 12 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 13 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 14 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 15 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 16 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 17 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 18 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 19 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:569 100 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:566 3 search INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 5; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f66a2b80>]. 
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 6 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 7 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 8 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 9 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 10 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 11 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 12 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 13 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 14 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 15 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 16 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 17 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 18 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 19 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:569 100 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:571 If we are here, then no error has happened. We are good. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:574 Restore ACI INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 100 users
Passed suites/paged_results/paged_results_test.py::test_search_multiple_paging 4.70
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 100 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:607 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:610 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:615 Iteration 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:615 Iteration 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:615 Iteration 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 100 users
Passed suites/paged_results/paged_results_test.py::test_search_invalid_cookie[1000] 4.34
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 100 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:662 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:665 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:673 Put an invalid cookie (1000) to the control. TypeError is expected INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 100 users
Passed suites/paged_results/paged_results_test.py::test_search_invalid_cookie[-1] 4.64
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 100 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:662 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:665 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:673 Put an invalid cookie (-1) to the control. TypeError is expected INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 100 users
Passed suites/paged_results/paged_results_test.py::test_search_abandon_with_zero_size 0.43
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:706 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:709 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 10 users
Passed suites/paged_results/paged_results_test.py::test_search_pagedsizelimit_success 0.65
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to 20. Previous value - b'0'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:755 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 10; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f6679580>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:763 10 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to b'0'. Previous value - b'20'. Modified suffix - cn=config.
Passed suites/paged_results/paged_results_test.py::test_search_nspagedsizelimit[5-15-PASS] 0.56
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to 5. Previous value - b'0'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedSizeLimit to 15. Previous value - None. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:818 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:829 Expect to pass INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 10; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f6636790>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:831 10 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to b'0'. Previous value - b'5'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedSizeLimit to None. Previous value - b'15'. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com.
Passed suites/paged_results/paged_results_test.py::test_search_nspagedsizelimit[15-5-SIZELIMIT_EXCEEDED] 0.66
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to 15. Previous value - b'0'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedSizeLimit to 5. Previous value - None. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:818 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:825 Expect to fail with SIZELIMIT_EXCEEDED INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 10; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f6695640>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to b'0'. Previous value - b'15'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedSizeLimit to None. Previous value - b'5'. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com.
Passed suites/paged_results/paged_results_test.py::test_search_paged_limits[conf_attr_values0-ADMINLIMIT_EXCEEDED] 4.25
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 101 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-sizelimit to 5000. Previous value - b'2000'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to 5000. Previous value - b'0'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to 100. Previous value - b'4000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to 100. Previous value - b'5000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:890 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:897 Expect to fail with ADMINLIMIT_EXCEEDED INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 10; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f593f580>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 6 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 7 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 8 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 101 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-sizelimit to b'2000'. Previous value - b'5000'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to b'0'. Previous value - b'5000'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to b'5000'. Previous value - b'100'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to b'4000'. Previous value - b'100'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config.
Passed suites/paged_results/paged_results_test.py::test_search_paged_limits[conf_attr_values1-PASS] 4.62
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 101 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-sizelimit to 5000. Previous value - b'2000'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to 5000. Previous value - b'0'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to 120. Previous value - b'4000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to 122. Previous value - b'5000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:890 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:901 Expect to pass INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 10; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f663afa0>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 6 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 7 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 8 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 9 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 10 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:903 101 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 101 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-sizelimit to b'2000'. Previous value - b'5000'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to b'0'. Previous value - b'5000'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to b'5000'. Previous value - b'122'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to b'4000'. Previous value - b'120'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config.
Passed suites/paged_results/paged_results_test.py::test_search_paged_user_limits[conf_attr_values0-ADMINLIMIT_EXCEEDED] 4.16
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 101 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to 1000. Previous value - b'5000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to 1000. Previous value - b'4000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedIDListScanLimit to 100. Previous value - None. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedLookthroughLimit to 100. Previous value - None. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:964 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:971 Expect to fail with ADMINLIMIT_EXCEEDED INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 10; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f593eeb0>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 6 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 7 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 8 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 101 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to b'5000'. Previous value - b'1000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to b'4000'. Previous value - b'1000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedIDListScanLimit to None. Previous value - b'100'. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedLookthroughLimit to None. Previous value - b'100'. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com.
Passed suites/paged_results/paged_results_test.py::test_search_paged_user_limits[conf_attr_values1-PASS] 4.42
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 101 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to 1000. Previous value - b'5000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to 1000. Previous value - b'4000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedIDListScanLimit to 120. Previous value - None. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedLookthroughLimit to 122. Previous value - None. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:964 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:975 Expect to pass INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 10; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f6664d00>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 6 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 7 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 8 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 9 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 10 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:977 101 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 101 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to b'5000'. Previous value - b'1000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to b'4000'. Previous value - b'1000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedIDListScanLimit to None. Previous value - b'120'. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedLookthroughLimit to None. Previous value - b'122'. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com.
Passed suites/paged_results/paged_results_test.py::test_ger_basic 0.86
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 4; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f66528b0>, <ldap.controls.simple.GetEffectiveRightsControl object at 0x7fa5f6664700>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1015 20 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1017 Check for attributeLevelRights INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1020 Remove added users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 20 users
Passed suites/paged_results/paged_results_test.py::test_multi_suffix_search 8.66
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:87 Adding suffix: o=test_parent and backend: parent_base INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:96 Adding ACI to allow our test user to search INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:108 Adding suffix: ou=child,o=test_parent and backend: child_base
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1051 Clear the access log INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: o=test_parent; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 4; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f6509970>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1062 20 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1065 Restart the server to flush the logs INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1071 Assert that last pr_cookie == -1 and others pr_cookie == 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1076 Remove added users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 10 users
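The pr_cookie assertion is made against the server's access log after a restart flushes it. Roughly, with lib389 (a sketch only, assuming a DirSrv handle named inst and that ds_access_log.match() returns the matching log lines):

    # Sketch: `inst` is a lib389 DirSrv handle for the test instance.
    inst.restart()  # flush buffered access-log entries, as the test does
    lines = inst.ds_access_log.match(r'.*pr_cookie=.*')
    # Intermediate pages should log pr_cookie=0; the final page pr_cookie=-1.
    assert all('pr_cookie=0' in line for line in lines[:-1])
    assert 'pr_cookie=-1' in lines[-1]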
Passed suites/paged_results/paged_results_test.py::test_maxsimplepaged_per_conn_success[None] 1.21
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1109 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 4; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f6519730>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1116 20 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1119 Remove added users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 20 users
Passed suites/paged_results/paged_results_test.py::test_maxsimplepaged_per_conn_success[-1] 0.87
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-maxsimplepaged-per-conn to -1. Previous value - b'-1'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1109 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 4; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f665f1c0>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1116 20 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1119 Remove added users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-maxsimplepaged-per-conn to b'-1'. Previous value - b'-1'. Modified suffix - cn=config.
Passed suites/paged_results/paged_results_test.py::test_maxsimplepaged_per_conn_success[1000] 1.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-maxsimplepaged-per-conn to 1000. Previous value - b'-1'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1109 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 4; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7fa5f668c640>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1116 20 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1119 Remove added users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-maxsimplepaged-per-conn to b'-1'. Previous value - b'1000'. Modified suffix - cn=config.
Passed suites/paged_results/paged_results_test.py::test_maxsimplepaged_per_conn_failure[0] 1.16
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-maxsimplepaged-per-conn to 0. Previous value - b'-1'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1154 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1157 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1172 Remove added users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-maxsimplepaged-per-conn to b'-1'. Previous value - b'0'. Modified suffix - cn=config.
Passed suites/paged_results/paged_results_test.py::test_maxsimplepaged_per_conn_failure[1] 0.88
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-maxsimplepaged-per-conn to 1. Previous value - b'-1'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1154 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1157 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1172 Remove added users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-maxsimplepaged-per-conn to b'-1'. Previous value - b'1'. Modified suffix - cn=config.
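nsslapd-maxsimplepaged-per-conn caps how many simple paged searches may run on one connection (-1 means unlimited; the failure runs above use 0 and 1). A minimal sketch of toggling it, assuming hypothetical Directory Manager credentials:

    import ldap

    dm = ldap.initialize('ldap://localhost:38901')
    dm.simple_bind_s('cn=Directory Manager', 'password')
    # Forbid simple paged searches on any connection.
    dm.modify_s('cn=config',
                [(ldap.MOD_REPLACE, 'nsslapd-maxsimplepaged-per-conn', b'0')])
    # A subsequent paged search is then expected to be refused by the server.
    # Restore the default afterwards, as the tests do:
    dm.modify_s('cn=config',
                [(ldap.MOD_REPLACE, 'nsslapd-maxsimplepaged-per-conn', b'-1')])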
Passed suites/password/password_policy_test.py::test_password_change_section 1.11
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/password/password_policy_test.py::test_password_syntax_section 2.49
No log output captured.
Passed suites/password/password_policy_test.py::test_password_history_section 8.83
No log output captured.
Passed suites/password/password_policy_test.py::test_password_minimum_age_section 11.26
No log output captured.
Passed suites/password/password_policy_test.py::test_account_lockout_and_lockout_duration_section 6.35
No log output captured.
Passed suites/password/password_policy_test.py::test_grace_limit_section 16.94
No log output captured.
Passed suites/password/password_policy_test.py::test_additional_corner_cases 1.29
No log output captured.
Passed suites/password/password_test.py::test_password_delete_specific_password 0.54
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.password_test:password_test.py:44 Running test_password_delete_specific_password... INFO  tests.suites.password.password_test:password_test.py:65 test_password_delete_specific_password: PASSED
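Deleting a specific password value is an ordinary MOD_DELETE on userPassword that names the value to remove; the server only accepts it when the supplied value matches. A minimal sketch (entry DN and credentials hypothetical):

    import ldap

    conn = ldap.initialize('ldap://localhost:38901')
    conn.simple_bind_s('cn=Directory Manager', 'password')
    user_dn = 'uid=test_user,ou=People,dc=example,dc=com'  # hypothetical entry
    # Succeeds only if 'mypassword' matches the stored value.
    conn.modify_s(user_dn, [(ldap.MOD_DELETE, 'userPassword', [b'mypassword'])])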
Passed suites/password/pbkdf2_upgrade_plugin_test.py::test_pbkdf2_upgrade 9.62
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/password/pwdAdmin_test.py::test_pwdAdmin_bypass 0.15
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  tests.suites.password.pwdAdmin_test:pwdAdmin_test.py:46 test_pwdAdmin_init: Creating Password Administrator entries... INFO  tests.suites.password.pwdAdmin_test:pwdAdmin_test.py:85 test_pwdAdmin_init: Configuring password policy... INFO  tests.suites.password.pwdAdmin_test:pwdAdmin_test.py:100 Add aci to allow password admin to add/update entries... INFO  tests.suites.password.pwdAdmin_test:pwdAdmin_test.py:115 test_pwdAdmin_init: Bind as the Password Administrator (before activating)... INFO  tests.suites.password.pwdAdmin_test:pwdAdmin_test.py:128 test_pwdAdmin_init: Attempt to add entries with invalid passwords, these adds should fail...
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdAdmin_test:pwdAdmin_test.py:167 test_pwdAdmin: Activate the Password Administrator...
Passed suites/password/pwdAdmin_test.py::test_pwdAdmin_no_admin 0.10
No log output captured.
Passed suites/password/pwdAdmin_test.py::test_pwdAdmin_modify 0.16
No log output captured.
Passed suites/password/pwdAdmin_test.py::test_pwdAdmin_group 0.17
No log output captured.
Passed suites/password/pwdAdmin_test.py::test_pwdAdmin_config_validation 0.03
No log output captured.
Passed suites/password/pwdModify_test.py::test_pwd_modify_with_different_operation 15.29
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:101 Attempt for Password change for an entry that does not exist INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:103 Attempt for Password change for an entry that exists INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:105 Attempt for Password change to old for an entry that exists INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:107 Attempt for Password Change with Binddn as testuser but with wrong old password INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:111 Attempt for Password Change with Binddn as testuser INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:113 Attempt for Password Change without giving newpassword INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:116 Change password to NEW_PASSWD i.e. newpassword INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:119 Check binding with old/new password INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:124 Change password back to OLD_PASSWD i.e. password INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:127 Checking password change Operation using a Non-Secure connection INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:131 Testuser attempts to change password for testuser2 (userPassword attribute is Set) INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:147 Directory Manager attempts to change password for testuser2 (userPassword attribute is Set) INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:150 Changing userPassword attribute to Undefined for testuser2 INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:152 Testuser attempts to change password for testuser2 (userPassword attribute is Undefined) INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:156 Directory Manager attempts to change password for testuser2 (userPassword attribute is Undefined) INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:159 Create a password syntax policy. Attempt to change to a password that violates that policy INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:163 Reset password syntax policy INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:165 userPassword mod with control results in ber decode error INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:169 Resetting the testuser's password
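These steps exercise the RFC 3062 password modify extended operation, which python-ldap exposes as passwd_s(). A minimal sketch (DN and passwords hypothetical):

    import ldap

    conn = ldap.initialize('ldap://localhost:38901')
    conn.simple_bind_s('uid=testuser,ou=People,dc=example,dc=com', 'password')
    # RFC 3062 password modify: the server verifies the old password first.
    conn.passwd_s('uid=testuser,ou=People,dc=example,dc=com',
                  'password', 'newpassword')
    # A wrong old password raises an LDAP error; per RFC 3062 the new
    # password may also be omitted to ask the server to generate one.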
Passed suites/password/pwdModify_test.py::test_pwd_modify_with_password_policy 0.12
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:44 Change the pwd storage type to clear and change the password once to refresh it (for the rest of the tests)
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:192 Change the password and check that a new entry has been added to the history INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:196 Try changing password to one stored in history. Should fail INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:199 Change the password several times in a row, and try binding after each change
Passed suites/password/pwdModify_test.py::test_pwd_modify_with_subsuffix 0.11
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:228 Add a new SubSuffix INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:235 Add the container & create password policies INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:243 Add two New users under the SubEntry INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:265 Changing password of uid=test_user0,ou=TestPeople_bug834047,dc=example,dc=com to newpassword INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:270 Try to delete password- case when password is specified INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:274 Try to delete password- case when password is not specified
Passed suites/password/pwdPolicy_attribute_test.py::test_pwd_reset 1.89
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:39 Adding test user {}
Passed suites/password/pwdPolicy_attribute_test.py::test_change_pwd[on-off-UNWILLING_TO_PERFORM] 2.17
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:66 Create password policy for subtree ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:69 Create password policy for user uid=simplepaged_test,ou=people,dc=example,dc=com
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:154 Set passwordChange to "on" - ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:161 Set passwordChange to "off" - uid=simplepaged_test,ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:170 Bind as user and modify userPassword INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:182 Bind as DM
Passed suites/password/pwdPolicy_attribute_test.py::test_change_pwd[off-off-UNWILLING_TO_PERFORM] 2.17
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:154 Set passwordChange to "off" - ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:161 Set passwordChange to "off" - uid=simplepaged_test,ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:170 Bind as user and modify userPassword INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:182 Bind as DM
Passed suites/password/pwdPolicy_attribute_test.py::test_change_pwd[off-on-False] 2.21
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:154 Set passwordChange to "off" - ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:161 Set passwordChange to "on" - uid=simplepaged_test,ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:170 Bind as user and modify userPassword INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:182 Bind as DM
Passed suites/password/pwdPolicy_attribute_test.py::test_change_pwd[on-on-False] 2.23
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:154 Set passwordChange to "on" - ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:161 Set passwordChange to "on" - uid=simplepaged_test,ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:170 Bind as user and modify userPassword INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:182 Bind as DM
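The test_change_pwd runs above pair a subtree policy on ou=people with a user-level policy on the bind entry; the user-level setting wins, which is why "on"/"off" yields UNWILLING_TO_PERFORM. A sketch with lib389's PwPolicyManager (assuming a DirSrv handle named inst; property names from the log):

    from lib389.pwpolicy import PwPolicyManager

    # Sketch: `inst` is a lib389 DirSrv handle for the test instance.
    pwp = PwPolicyManager(inst)
    pwp.create_subtree_policy('ou=people,dc=example,dc=com',
                              {'passwordChange': 'on'})
    pwp.create_user_policy('uid=simplepaged_test,ou=people,dc=example,dc=com',
                           {'passwordChange': 'off'})
    # The user-level policy takes precedence: this user's own userPassword
    # modifications are refused despite the subtree policy saying "on".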
Passed suites/password/pwdPolicy_attribute_test.py::test_pwd_min_age 14.50
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:219 Set passwordminage to "10" - ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:224 Set passwordminage to "10" - uid=simplepaged_test,ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:228 Set passwordminage to "10" - cn=config INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:233 Bind as user and modify userPassword INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:239 Bind as user and modify userPassword straight away after the previous change INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:244 Wait 12 seconds INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:248 Bind as user and modify userPassword INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:256 Bind as DM
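passwordMinAge is measured in seconds, so a second change straight after the first violates the policy until the age has elapsed. A minimal sketch of the same sequence (connection details hypothetical):

    import ldap
    import time

    dm = ldap.initialize('ldap://localhost:38901')
    dm.simple_bind_s('cn=Directory Manager', 'password')
    dm.modify_s('cn=config', [(ldap.MOD_REPLACE, 'passwordMinAge', b'10')])

    user_dn = 'uid=simplepaged_test,ou=people,dc=example,dc=com'
    user = ldap.initialize('ldap://localhost:38901')
    user.simple_bind_s(user_dn, 'secret')
    user.modify_s(user_dn, [(ldap.MOD_REPLACE, 'userPassword', b'secret1')])
    try:
        # An immediate second change violates the minimum age.
        user.modify_s(user_dn, [(ldap.MOD_REPLACE, 'userPassword', b'secret2')])
    except ldap.CONSTRAINT_VIOLATION:
        time.sleep(12)  # wait out passwordMinAge, as the test does
        user.modify_s(user_dn, [(ldap.MOD_REPLACE, 'userPassword', b'secret2')])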
Passed suites/password/pwdPolicy_controls_test.py::test_pwd_must_change 2.69
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:109 Configure password policy with passwordMustChange set to "on" INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:120 Reset userpassword as Directory Manager INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:125 Bind should return ctrl with error code 2 (changeAfterReset)
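The changeAfterReset, grace-login, and expiring-time checks in this suite all read the password policy response control returned on bind. A hedged sketch, assuming python-ldap's ldap.controls.ppolicy module and its error / timeBeforeExpiration / graceAuthNsRemaining fields (connection details hypothetical):

    import ldap
    from ldap.controls.ppolicy import PasswordPolicyControl

    conn = ldap.initialize('ldap://localhost:38901')
    msgid = conn.simple_bind('uid=tuser,ou=people,dc=example,dc=com', 'secret',
                             serverctrls=[PasswordPolicyControl()])
    rtype, rdata, rmsgid, ctrls = conn.result3(
        msgid,
        resp_ctrl_classes={PasswordPolicyControl.controlType:
                           PasswordPolicyControl})
    for c in ctrls:
        if c.controlType == PasswordPolicyControl.controlType:
            # error == 2 corresponds to changeAfterReset; the other two
            # fields cover the expiring-warning and grace-login cases.
            print(c.error, c.timeBeforeExpiration, c.graceAuthNsRemaining)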
Passed suites/password/pwdPolicy_controls_test.py::test_pwd_expired_grace_limit 6.61
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:159 Configure password policy with grace limit set to 2 INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:164 Change password and wait for it to expire INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:168 Bind and use up one grace login (only one left) INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:178 Use up last grace login, should get control INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:184 No grace login available, bind should fail, and no control should be returned
Passed suites/password/pwdPolicy_controls_test.py::test_pwd_expiring_with_warning 5.53
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:206 Configure password policy INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:211 Change password and get controls INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:222 Warning has been sent, try the bind again, and recheck the expiring time
Passed suites/password/pwdPolicy_controls_test.py::test_pwd_expiring_with_no_warning 6.54
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:250 Configure password policy INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:255 When the warning is less than the max age, we never send expiring control response INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:262 Turn on sending expiring control regardless of warning INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:274 Check expiring time again INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:285 Turn off sending expiring control (restore the default setting)
Passed suites/password/pwdPolicy_inherit_global_test.py::test_entry_has_no_restrictions[off-off] 1.15
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:39 Adding user uid=buser,ou=People,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:46 Adding an aci for the bind user INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:71 Enable fine-grained policy INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of nsslapd-pwpolicy-inherit-global is off INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of passwordCheckSyntax is off
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:119 Set nsslapd-pwpolicy-inherit-global to off INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:120 Set passwordCheckSyntax to off INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of nsslapd-pwpolicy-inherit-global is off INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of passwordCheckSyntax is off INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:129 Bind as test user INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:132 Make sure an entry added to ou=people has no password syntax restrictions.
Passed suites/password/pwdPolicy_inherit_global_test.py::test_entry_has_no_restrictions[on-off] 1.04
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:119 Set nsslapd-pwpolicy-inherit-global to on INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:120 Set passwordCheckSyntax to off INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of nsslapd-pwpolicy-inherit-global is on INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of passwordCheckSyntax is off INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:129 Bind as test user INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:132 Make sure an entry added to ou=people has no password syntax restrictions.
Passed suites/password/pwdPolicy_inherit_global_test.py::test_entry_has_no_restrictions[off-on] 1.06
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:119 Set nsslapd-pwpolicy-inherit-global to off INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:120 Set passwordCheckSyntax to on INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of nsslapd-pwpolicy-inherit-global is off INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of passwordCheckSyntax is on INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:129 Bind as test user INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:132 Make sure an entry added to ou=people has no password syntax restrictions.
Passed suites/password/pwdPolicy_inherit_global_test.py::test_entry_has_restrictions 1.06
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:173 Set nsslapd-pwpolicy-inherit-global to on INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:174 Set passwordCheckSyntax to on INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of nsslapd-pwpolicy-inherit-global is on INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of passwordCheckSyntax is on INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:187 Bind as test user INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:192 Try to add user with a short password (<9) INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:197 Try to add user with a long password (>9) INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:201 Bind as DM user
Passed suites/password/pwdPolicy_token_test.py::test_token_lengths 3.73
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_token_test:pwdPolicy_token_test.py:67 Testing password len 4 token (test) INFO  tests.suites.password.pwdPolicy_token_test:pwdPolicy_token_test.py:72 Password correctly rejected: {'msgtype': 103, 'msgid': 12, 'result': 19, 'desc': 'Constraint violation', 'ctrls': [], 'info': 'invalid password syntax - password based off of user entry'} INFO  tests.suites.password.pwdPolicy_token_test:pwdPolicy_token_test.py:67 Testing password len 6 token (test_u) INFO  tests.suites.password.pwdPolicy_token_test:pwdPolicy_token_test.py:72 Password correctly rejected: {'msgtype': 103, 'msgid': 16, 'result': 19, 'desc': 'Constraint violation', 'ctrls': [], 'info': 'invalid password syntax - password based off of user entry'} INFO  tests.suites.password.pwdPolicy_token_test:pwdPolicy_token_test.py:67 Testing password len 10 token (test_user1) INFO  tests.suites.password.pwdPolicy_token_test:pwdPolicy_token_test.py:72 Password correctly rejected: {'msgtype': 103, 'msgid': 20, 'result': 19, 'desc': 'Constraint violation', 'ctrls': [], 'info': 'invalid password syntax - password based off of user entry'}
Passed suites/password/pwdPolicy_warning_test.py::test_different_values[ ] 0.14
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:207 Get the default value INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:211 An invalid value is being tested INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:173 Setting passwordSendExpiringTime to INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:215 Now check the value is unchanged INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:218 Invalid value was rejected correctly
Passed suites/password/pwdPolicy_warning_test.py::test_different_values[junk123] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:207 Get the default value INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:211 An invalid value is being tested INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:173 Setting passwordSendExpiringTime to junk123 INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:215 Now check the value is unchanged INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:218 Invalid value junk123 was rejected correctly
Passed suites/password/pwdPolicy_warning_test.py::test_different_values[on] 1.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:207 Get the default value INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:220 A valid value is being tested INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:173 Setting passwordSendExpiringTime to on INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:223 Now check that the value has been changed INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:226 passwordSendExpiringTime is now set to on INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:228 Set passwordSendExpiringTime back to the default value INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:173 Setting passwordSendExpiringTime to off
Passed suites/password/pwdPolicy_warning_test.py::test_different_values[off] 1.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:207 Get the default value INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:220 A valid value is being tested INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:173 Setting passwordSendExpiringTime to off INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:223 Now check that the value has been changed INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:226 passwordSendExpiringTime is now set to off INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:228 Set passwordSendExpiringTime back to the default value INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:173 Setting passwordSendExpiringTime to off
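The parametrized runs above assert that passwordSendExpiringTime only accepts "on"/"off" and that an invalid value leaves the stored setting untouched. A sketch of the same probe (credentials hypothetical; the exact error the server raises is not asserted here, only that the value stays unchanged):

    import ldap

    dm = ldap.initialize('ldap://localhost:38901')
    dm.simple_bind_s('cn=Directory Manager', 'password')
    try:
        dm.modify_s('cn=config',
                    [(ldap.MOD_REPLACE, 'passwordSendExpiringTime', b'junk123')])
    except ldap.LDAPError:
        pass  # rejected; the previous value must still be in effect
    # Valid values are applied normally:
    dm.modify_s('cn=config',
                [(ldap.MOD_REPLACE, 'passwordSendExpiringTime', b'on')])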
Passed suites/password/pwdPolicy_warning_test.py::test_expiry_time 0.11
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:47 Get the default values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:54 Set the new values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:115 Add the user
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:259 Get the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:260 Binding with (uid=tuser,ou=people,dc=example,dc=com) and requesting the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:160 Bind with the user and request the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:264 Check whether the time is returned INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:267 user's password will expire in 172800 seconds INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:270 Rebinding as DM
Passed suites/password/pwdPolicy_warning_test.py::test_password_warning[passwordSendExpiringTime-off] 0.64
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:47 Get the default values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:54 Set the new values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:115 Add the user
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:302 Set configuration parameter INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:173 Setting passwordSendExpiringTime to off INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:305 Binding with (uid=tuser,ou=people,dc=example,dc=com) and requesting password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:160 Bind with the user and request the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:309 Check the state of the control INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:311 Password Expiry warning time is not returned as passwordSendExpiringTime is set to off INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:317 Rebinding as DM
Passed suites/password/pwdPolicy_warning_test.py::test_password_warning[passwordWarning-3600] 0.61
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:47 Get the default values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:54 Set the new values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:115 Add the user
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:302 Set configuration parameter INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:173 Setting passwordWarning to 3600 INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:305 Binding with (uid=tuser,ou=people,dc=example,dc=com) and requesting password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:160 Bind with the user and request the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:309 Check the state of the control INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:314 (uid=tuser,ou=people,dc=example,dc=com) password will expire in 172799 seconds INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:317 Rebinding as DM
Passed suites/password/pwdPolicy_warning_test.py::test_with_different_password_states 0.26
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:47 Get the default values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:54 Set the new values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:115 Add the user
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:350 Expire user's password by changing passwordExpirationTime timestamp INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:354 Old passwordExpirationTime: 20201023002010Z INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:357 New passwordExpirationTime: 20200922002010Z INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:360 Attempting to bind with user uid=tuser,ou=people,dc=example,dc=com and retrieve the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:160 Bind with the user and request the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:364 Bind Failed, error: <ExceptionInfo INVALID_CREDENTIALS({'msgtype': 97, 'msgid': 73, 'result': 49, 'desc': 'Invalid credentials', 'ctrls': [('1.3.6.1.4.1.....8.5.1', 0, b'0\x84\x00\x00\x00\x03\x81\x01\x00'), ('2.16.840.1.113730.3.4.4', 0, b'0')], 'info': 'password expired!'}) tblen=10> INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:366 Rebinding as DM INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:369 Reverting user's passwordExpirationTime INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:373 Rebinding with uid=tuser,ou=people,dc=example,dc=com and retrieving the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:160 Bind with the user and request the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:376 Check that the control is returned INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:379 user's password will expire in 172800 seconds INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:382 Rebinding as DM
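As the captured exception shows, python-ldap surfaces the "password expired!" diagnostic and the response controls inside the INVALID_CREDENTIALS exception payload. A minimal sketch of inspecting them (DN and password hypothetical):

    import ldap

    conn = ldap.initialize('ldap://localhost:38901')
    try:
        conn.simple_bind_s('uid=tuser,ou=people,dc=example,dc=com', 'secret')
    except ldap.INVALID_CREDENTIALS as e:
        info = e.args[0]
        print(info.get('info'))  # e.g. 'password expired!'
        # 'ctrls' is a list of (oid, criticality, value) tuples; the log
        # above shows the ppolicy control and 2.16.840.1.113730.3.4.4.
        for oid, criticality, value in info.get('ctrls', []):
            print(oid)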
Passed suites/password/pwdPolicy_warning_test.py::test_default_behavior 0.11
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:85 Get the default values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:91 Set the new values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:115 Add the user
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:408 Binding with uid=tuser,ou=people,dc=example,dc=com and requesting the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:160 Bind with the user and request the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:412 Check that no control is returned INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:415 Rebinding as DM
Passed suites/password/pwdPolicy_warning_test.py::test_when_maxage_and_warning_are_the_same 2.40
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:85 Get the default values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:91 Set the new values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:115 Add the user
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:444 Set the new values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:448 First change user's password to reset its password expiration time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:455 Binding with uid=tuser,ou=people,dc=example,dc=com and requesting the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:160 Bind with the user and request the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:459 Check that the control is returned even if passwordSendExpiringTime is set to off INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:463 user's password will expire in 86400 seconds INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:465 Rebinding as DM
Passed suites/password/pwdPolicy_warning_test.py::test_with_local_policy 0.14
-----------------------------Captured stderr setup------------------------------
Container entries added.
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:47 Get the default values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:54 Set the new values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:115 Add the user INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:143 Setting fine grained policy for user (uid=tuser,ou=people,dc=example,dc=com)
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:492 Attempting to get password expiry warning time for user uid=tuser,ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:160 Bind with the user and request the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:495 Check that the control is not returned INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:498 Password expiry warning time is not returned INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:500 Rebinding as DM
Passed suites/password/pwdPolicy_warning_test.py::test_search_shadowWarning_when_passwordWarning_is_lower 0.20
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:47 Get the default values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:54 Set the new values
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:532 Bind as cn=Directory Manager INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:535 Creating test user INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:540 Setting passwordWarning to smaller value than 86400 INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:543 Bind as test user INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:546 Check if attribute shadowWarning is present
Passed suites/password/pwdPolicy_warning_test.py::test_password_expire_works 1.64
No log output captured.
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[CLEAR] 0.33
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test CLEAR PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[CRYPT] 0.05
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test CRYPT PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[CRYPT-MD5] 0.04
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test CRYPT-MD5 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[CRYPT-SHA256] 0.07
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test CRYPT-SHA256 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[CRYPT-SHA512] 0.07
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test CRYPT-SHA512 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[MD5] 0.05
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test MD5 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[SHA] 0.20
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test SHA PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[SHA256] 0.05
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test SHA256 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[SHA384] 0.05
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test SHA384 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[SHA512] 0.40
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test SHA512 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[SMD5] 0.05
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test SMD5 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[SSHA] 0.06
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test SSHA PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[SSHA256] 0.05
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test SSHA256 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[SSHA384] 0.05
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test SSHA384 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[SSHA512] 0.05
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test SSHA512 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[PBKDF2_SHA256] 0.25
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test PBKDF2_SHA256 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[DEFAULT] 0.25
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test DEFAULT PASSED
Passed suites/password/pwd_algo_test.py::test_pbkdf2_algo 2.60
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:170 Test PASSED
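Each pwd_algo_test case above repeats one pattern: select a passwordStorageScheme in cn=config, reset the user's userPassword, confirm the stored value is hashed under that scheme, and confirm the clear-text value still binds. A minimal sketch of that pattern with python-ldap follows; the URI, user DN, and credentials are illustrative assumptions, not the suite's fixtures.

import ldap

URI = 'ldap://localhost:38901'                     # assumed instance URI
USER_DN = 'uid=user1,ou=people,dc=example,dc=com'  # assumed test user

conn = ldap.initialize(URI)
conn.simple_bind_s('cn=Directory Manager', 'password')

# Select the scheme under test (CLEAR, CRYPT-SHA512, SSHA512, PBKDF2_SHA256, ...)
conn.modify_s('cn=config',
              [(ldap.MOD_REPLACE, 'passwordStorageScheme', b'PBKDF2_SHA256')])

# Resetting userPassword makes the server hash it with the configured scheme
conn.modify_s(USER_DN, [(ldap.MOD_REPLACE, 'userPassword', b'Secret123')])

# The stored value carries the scheme prefix, e.g. {PBKDF2_SHA256}...
stored = conn.search_s(USER_DN, ldap.SCOPE_BASE,
                       attrlist=['userPassword'])[0][1]['userPassword'][0]
assert stored.startswith(b'{PBKDF2_SHA256}')

# The clear-text password must still authenticate
ldap.initialize(URI).simple_bind_s(USER_DN, 'Secret123')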
Passed suites/password/pwd_lockout_bypass_test.py::test_lockout_bypass 0.69
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/password/pwd_log_test.py::test_hide_unhashed_pwd 4.37
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_log_test:pwd_log_test.py:79 Test complete
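test_hide_unhashed_pwd verifies that a password change does not leave the clear-text value in the server logs. A sketch of the idea, assuming the cn=config switch nsslapd-auditlog-logging-hide-unhashed-pw and the slapd-standalone1 log path seen elsewhere in this report:

import ldap

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('cn=Directory Manager', 'password')
conn.modify_s('cn=config', [
    (ldap.MOD_REPLACE, 'nsslapd-auditlog-logging-enabled', b'on'),
    (ldap.MOD_REPLACE, 'nsslapd-auditlog-logging-hide-unhashed-pw', b'on'),
])
conn.modify_s('uid=user1,ou=people,dc=example,dc=com',
              [(ldap.MOD_REPLACE, 'userPassword', b'Secret123')])

# With the switch on, only the hashed value should reach the audit log
with open('/var/log/dirsrv/slapd-standalone1/audit') as f:
    assert 'Secret123' not in f.read()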
Passed suites/password/pwp_gracel_test.py::test_password_gracelimit_section 13.77
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
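test_password_gracelimit_section exercises grace logins: once passwordMaxAge has expired a password, the user may still bind passwordGraceLimit more times before binds fail with err=49. A sketch under assumed values:

import ldap
import time

URI = 'ldap://localhost:38901'
USER_DN = 'uid=user1,ou=people,dc=example,dc=com'

conn = ldap.initialize(URI)
conn.simple_bind_s('cn=Directory Manager', 'password')
conn.modify_s('cn=config', [
    (ldap.MOD_REPLACE, 'passwordExp', b'on'),
    (ldap.MOD_REPLACE, 'passwordMaxAge', b'5'),      # seconds until expiry
    (ldap.MOD_REPLACE, 'passwordGraceLimit', b'3'),  # binds allowed after expiry
])
conn.modify_s(USER_DN, [(ldap.MOD_REPLACE, 'userPassword', b'Secret123')])

time.sleep(6)  # let the password age past passwordMaxAge
for _ in range(3):
    ldap.initialize(URI).simple_bind_s(USER_DN, 'Secret123')  # grace logins
try:
    ldap.initialize(URI).simple_bind_s(USER_DN, 'Secret123')
except ldap.INVALID_CREDENTIALS:
    pass  # grace logins exhausted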
Passed suites/password/pwp_history_test.py::test_history_is_not_overwritten 4.62
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:77 Configured password policy.
Passed suites/password/pwp_history_test.py::test_basic 7.83
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:152 Configured password policy.
INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:170 Password change correctly rejected
INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:201 Correct number of passwords found in history.
INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:212 Password change correctly rejected
INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:222 Password change correctly rejected
INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:232 Password change correctly rejected
INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:254 Password change correctly rejected
INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:267 Configured passwordInHistory to 0.
INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:283 Password change correctly rejected
INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:299 Configured passwordInHistory to 2.
INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:312 Password change correctly rejected
INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:326 Test suite PASSED.
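The history checks above reduce to: turn passwordHistory on, bound the remembered list with passwordInHistory, and expect a self-service change back to a remembered value to be rejected. A sketch with python-ldap (DNs and passwords are illustrative):

import ldap

URI = 'ldap://localhost:38901'
USER_DN = 'uid=user1,ou=people,dc=example,dc=com'

conn = ldap.initialize(URI)
conn.simple_bind_s('cn=Directory Manager', 'password')
conn.modify_s('cn=config', [
    (ldap.MOD_REPLACE, 'passwordHistory', b'on'),
    (ldap.MOD_REPLACE, 'passwordInHistory', b'2'),  # remember the last two
])

user = ldap.initialize(URI)
user.simple_bind_s(USER_DN, 'Secret123')
user.passwd_s(USER_DN, 'Secret123', 'Secret456')
try:
    user.passwd_s(USER_DN, 'Secret456', 'Secret123')  # still in history
except ldap.CONSTRAINT_VIOLATION:
    pass  # the "Password change correctly rejected" case above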
Passed suites/password/pwp_test.py::test_passwordchange_to_no 0.37
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/password/pwp_test.py::test_password_check_syntax 0.33
No log output captured.
Passed suites/password/pwp_test.py::test_too_big_password 0.42
No log output captured.
Passed suites/password/pwp_test.py::test_pwminage 3.12
No log output captured.
Passed suites/password/pwp_test.py::test_invalid_credentials 7.57
No log output captured.
Passed suites/password/pwp_test.py::test_expiration_date 1.23
No log output captured.
Passed suites/password/pwp_test.py::test_passwordlockout 2.37
No log output captured.
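test_invalid_credentials and test_passwordlockout above drive the lockout counters: after passwordMaxFailure failed binds the account locks, and even the correct password is refused until passwordLockoutDuration elapses. A compact sketch (values assumed):

import ldap

URI = 'ldap://localhost:38901'
USER_DN = 'uid=user1,ou=people,dc=example,dc=com'

conn = ldap.initialize(URI)
conn.simple_bind_s('cn=Directory Manager', 'password')
conn.modify_s('cn=config', [
    (ldap.MOD_REPLACE, 'passwordLockout', b'on'),
    (ldap.MOD_REPLACE, 'passwordMaxFailure', b'3'),
    (ldap.MOD_REPLACE, 'passwordLockoutDuration', b'5'),  # seconds
])

for _ in range(3):  # burn the failure budget
    try:
        ldap.initialize(URI).simple_bind_s(USER_DN, 'wrong-password')
    except ldap.INVALID_CREDENTIALS:
        pass

try:
    ldap.initialize(URI).simple_bind_s(USER_DN, 'Secret123')
except ldap.CONSTRAINT_VIOLATION:
    pass  # locked: even the correct password is refused for now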
Passed suites/password/regression_of_bugs_test.py::test_local_password_policy 0.10
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/password/regression_of_bugs_test.py::test_passwordexpirationtime_attribute 3.05
No log output captured.
Passed suites/password/regression_of_bugs_test.py::test_admin_group_to_modify_password 0.90
No log output captured.
Passed suites/password/regression_of_bugs_test.py::test_password_max_failure_should_lockout_password 0.11
No log output captured.
Passed suites/password/regression_of_bugs_test.py::test_pwd_update_time_attribute 3.21
No log output captured.
Passed suites/password/regression_of_bugs_test.py::test_password_track_update_time 7.42
No log output captured.
Passed suites/password/regression_of_bugs_test.py::test_signal_11 0.04
No log output captured.
Passed suites/password/regression_test.py::test_pwp_local_unlock 4.11
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
INFO  tests.suites.password.regression_test:regression_test.py:68 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to on
INFO  tests.suites.password.regression_test:regression_test.py:75 Configure subtree password policy for ou=people,dc=example,dc=com
INFO  tests.suites.password.regression_test:regression_test.py:98 Adding user-uid=UIDpwtest1,ou=people,dc=example,dc=com
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:137 Verify user can bind...
INFO  tests.suites.password.regression_test:regression_test.py:140 Test passwordUnlock default - user should be able to reset password after lockout
INFO  tests.suites.password.regression_test:regression_test.py:151 Verify account is locked
INFO  tests.suites.password.regression_test:regression_test.py:155 Wait for lockout duration...
INFO  tests.suites.password.regression_test:regression_test.py:158 Check if user can now bind with correct password
Passed suites/password/regression_test.py::test_trivial_passw_check[CNpwtest1] 0.14
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with CNpwtest1
Passed suites/password/regression_test.py::test_trivial_passw_check[SNpwtest1] 0.14
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with SNpwtest1
Passed suites/password/regression_test.py::test_trivial_passw_check[UIDpwtest1] 0.19
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with UIDpwtest1
Passed suites/password/regression_test.py::test_trivial_passw_check[MAILpwtest1@redhat.com] 0.14
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with MAILpwtest1@redhat.com
Passed suites/password/regression_test.py::test_trivial_passw_check[GNpwtest1] 0.15
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with GNpwtest1
Passed suites/password/regression_test.py::test_trivial_passw_check[CNpwtest1ZZZZ] 0.14
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with CNpwtest1ZZZZ
Passed suites/password/regression_test.py::test_trivial_passw_check[ZZZZZCNpwtest1] 0.29
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with ZZZZZCNpwtest1
Passed suites/password/regression_test.py::test_trivial_passw_check[ZCNpwtest1] 0.13
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with ZCNpwtest1
Passed suites/password/regression_test.py::test_trivial_passw_check[CNpwtest1Z] 0.14
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with CNpwtest1Z
Passed suites/password/regression_test.py::test_trivial_passw_check[ZCNpwtest1Z] 0.14
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with ZCNpwtest1Z
Passed suites/password/regression_test.py::test_trivial_passw_check[ZZCNpwtest1] 0.15
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with ZZCNpwtest1
Passed suites/password/regression_test.py::test_trivial_passw_check[CNpwtest1ZZ] 0.16
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with CNpwtest1ZZ
Passed suites/password/regression_test.py::test_trivial_passw_check[ZZCNpwtest1ZZ] 0.14
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with ZZCNpwtest1ZZ
Passed suites/password/regression_test.py::test_trivial_passw_check[ZZZCNpwtest1] 0.14
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with ZZZCNpwtest1
Passed suites/password/regression_test.py::test_trivial_passw_check[CNpwtest1ZZZ] 0.14
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with CNpwtest1ZZZ
Passed suites/password/regression_test.py::test_trivial_passw_check[ZZZCNpwtest1ZZZ] 0.14
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with ZZZCNpwtest1ZZZ
Passed suites/password/regression_test.py::test_trivial_passw_check[ZZZZZZCNpwtest1ZZZZZZZZ] 0.15
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with ZZZZZZCNpwtest1ZZZZZZZZ
Passed suites/password/regression_test.py::test_global_vs_local[CNpwtest1] 0.19
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off
INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with CNpwtest1
Passed suites/password/regression_test.py::test_global_vs_local[SNpwtest1] 0.34
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off
INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with SNpwtest1
Passed suites/password/regression_test.py::test_global_vs_local[UIDpwtest1] 0.44
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off
INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with UIDpwtest1
Passed suites/password/regression_test.py::test_global_vs_local[MAILpwtest1@redhat.com] 0.20
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off
INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with MAILpwtest1@redhat.com
Passed suites/password/regression_test.py::test_global_vs_local[GNpwtest1] 0.20
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off
INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with GNpwtest1
Passed suites/password/regression_test.py::test_global_vs_local[CNpwtest1ZZZZ] 0.24
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off
INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with CNpwtest1ZZZZ
Passed suites/password/regression_test.py::test_global_vs_local[ZZZZZCNpwtest1] 0.24
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off
INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with ZZZZZCNpwtest1
Passed suites/password/regression_test.py::test_global_vs_local[ZCNpwtest1] 0.19
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off
INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with ZCNpwtest1
Passed suites/password/regression_test.py::test_global_vs_local[CNpwtest1Z] 0.22
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off
INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with CNpwtest1Z
Passed suites/password/regression_test.py::test_global_vs_local[ZCNpwtest1Z] 0.19
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off
INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with ZCNpwtest1Z
Passed suites/password/regression_test.py::test_global_vs_local[ZZCNpwtest1] 0.20
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off
INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with ZZCNpwtest1
Passed suites/password/regression_test.py::test_global_vs_local[CNpwtest1ZZ] 0.35
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off
INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with CNpwtest1ZZ
Passed suites/password/regression_test.py::test_global_vs_local[ZZCNpwtest1ZZ] 0.20
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off
INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with ZZCNpwtest1ZZ
Passed suites/password/regression_test.py::test_global_vs_local[ZZZCNpwtest1] 0.21
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off
INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with ZZZCNpwtest1
Passed suites/password/regression_test.py::test_global_vs_local[CNpwtest1ZZZ] 0.42
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off
INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with CNpwtest1ZZZ
Passed suites/password/regression_test.py::test_global_vs_local[ZZZCNpwtest1ZZZ] 0.20
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off
INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with ZZZCNpwtest1ZZZ
Passed suites/password/regression_test.py::test_global_vs_local[ZZZZZZCNpwtest1ZZZZZZZZ] 0.42
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off
INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with ZZZZZZCNpwtest1ZZZZZZZZ
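Every trivial-word case above (CNpwtest1, ZZCNpwtest1ZZ, and the rest) probes the same rule: with passwordCheckSyntax on, a new password that contains one of the entry's own attribute values (uid, cn, sn, givenName, mail) is refused even when padded with other characters; test_global_vs_local repeats the run with nsslapd-pwpolicy-local toggled off so the global policy applies. The core check, sketched with python-ldap against the user the setup log shows being added (the user's old password is an assumption):

import ldap

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('cn=Directory Manager', 'password')
conn.modify_s('cn=config', [
    (ldap.MOD_REPLACE, 'passwordCheckSyntax', b'on'),
    (ldap.MOD_REPLACE, 'nsslapd-pwpolicy-local', b'off'),
])

USER_DN = 'uid=UIDpwtest1,ou=people,dc=example,dc=com'
user = ldap.initialize('ldap://localhost:38901')
user.simple_bind_s(USER_DN, 'OldSecret123')  # assumed current password
try:
    user.modify_s(USER_DN,
                  [(ldap.MOD_REPLACE, 'userPassword', b'ZZUIDpwtest1ZZ')])
except ldap.CONSTRAINT_VIOLATION:
    pass  # rejected: the new password embeds the uid value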
Passed suites/plugins/acceptance_test.py::test_acctpolicy 15.51
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38902, 'ldap-secureport': 63602, 'server-id': 'standalone2', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/plugins/acceptance_test.py::test_attruniq 17.66
No log output captured.
Passed suites/plugins/acceptance_test.py::test_automember 25.03
No log output captured.
Passed suites/plugins/acceptance_test.py::test_dna 17.24
No log output captured.
Passed suites/plugins/acceptance_test.py::test_linkedattrs 25.26
No log output captured.
Passed suites/plugins/acceptance_test.py::test_memberof 34.82
No log output captured.
Passed suites/plugins/acceptance_test.py::test_mep 17.78
No log output captured.
Passed suites/plugins/acceptance_test.py::test_passthru 22.06
No log output captured.
Passed suites/plugins/acceptance_test.py::test_referint 13.49
No log output captured.
Passed suites/plugins/acceptance_test.py::test_retrocl 22.59
No log output captured.
Passed suites/plugins/acceptance_test.py::test_rootdn 33.90
No log output captured.
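The acceptance tests above share one setup shape: take the plugin's wrapper object, enable it, restart the server, exercise the behaviour, then disable it again. A sketch of that cycle with lib389 (the serverid and password are assumptions):

from lib389 import DirSrv
from lib389.plugins import MemberOfPlugin, ReferentialIntegrityPlugin

inst = DirSrv()
inst.local_simple_allocate(serverid='standalone1', password='password')
inst.open()

for plugin_cls in (MemberOfPlugin, ReferentialIntegrityPlugin):
    plugin = plugin_cls(inst)
    plugin.enable()      # sets nsslapd-pluginEnabled: on
inst.restart()           # plugin changes take effect after a restart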
Passed suites/plugins/accpol_test.py::test_glact_inact 27.23
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.utils:accpol_test.py:35 Configuring Global account policy plugin, pwpolicy attributes and restarting the server
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:348 AccountInactivityLimit set to 12. Account will be inactivated if not accessed in 12 secs
INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs
INFO  lib389.utils:accpol_test.py:351 Sleep for 10 secs to check if account is not inactivated, expected value 0
INFO  lib389.utils:accpol_test.py:353 Account should not be inactivated since AccountInactivityLimit not exceeded
INFO  lib389.utils:accpol_test.py:356 Sleep for 3 more secs to check if account is inactivated
INFO  lib389.utils:accpol_test.py:360 Sleep +10 secs to check if account glinactusr3 is inactivated
INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_glremv_lastlogin 19.18
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:394 AccountInactivityLimit set to 12. Account will be inactivated if not accessed in 12 secs
INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs
INFO  lib389.utils:accpol_test.py:396 Sleep for 6 secs to check if account is not inactivated, expected value 0
INFO  lib389.utils:accpol_test.py:398 Account should not be inactivated since AccountInactivityLimit not exceeded
INFO  lib389.utils:accpol_test.py:254 Delete lastLoginTime/createTimeStamp/ModifyTimeStamp attribute from user account
INFO  lib389.utils:accpol_test.py:401 Sleep for 7 more secs to check if account is inactivated
INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute
INFO  lib389.utils:accpol_test.py:405 Check if account is activated, expected 0
INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_glact_login 23.29
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:434 AccountInactivityLimit set to 12. Account will be inactivated if not accessed in 12 secs
INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs
INFO  lib389.utils:accpol_test.py:436 Sleep for 13 secs to check if account is inactivated, expected error 19
INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute
INFO  lib389.utils:accpol_test.py:440 Check if account is activated, expected 0
INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_glinact_limit 122.50
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:492 AccountInactivityLimit set to 12. Account will be inactivated if not accessed in 12 secs
INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs
INFO  lib389.utils:accpol_test.py:494 Sleep for 9 secs to check if account is not inactivated, expected 0
INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN
INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN
INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs
INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN
INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs
INFO  lib389.utils:accpol_test.py:516 Check if account is activated, expected 0
INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute
INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN
INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_glnologin_attr 84.27
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:575 AccountInactivityLimit set to 12. Account will be inactivated if not accessed in 12 secs
INFO  lib389.utils:accpol_test.py:576 Set attribute StateAttrName to createTimestamp, loginTime attr won't be considered
INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN
INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs
INFO  lib389.utils:accpol_test.py:580 Sleep for 9 secs to check if account is not inactivated, expected 0
INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN
INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN
INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs
INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN
INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs
INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN
INFO  lib389.utils:accpol_test.py:605 Set attribute StateAttrName to lastLoginTime, the default
INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN
INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute
INFO  lib389.utils:accpol_test.py:609 Check if account is activated, expected 0
INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_glnoalt_stattr 56.51
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:640 Set attribute altStateAttrName to 1.1
INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN
INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs
INFO  lib389.utils:accpol_test.py:644 Sleep for 13 secs to check if account is not inactivated, expected 0
INFO  lib389.utils:accpol_test.py:647 lastLoginTime attribute is added from the above ldap bind by userdn
INFO  lib389.utils:accpol_test.py:254 Delete lastLoginTime/createTimeStamp/ModifyTimeStamp attribute from user account
INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN
INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute
INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_glattr_modtime 45.27
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:687 Set attribute altStateAttrName to modifyTimestamp
INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN
INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs
INFO  lib389.utils:accpol_test.py:691 Sleep for 13 secs to check if account is inactivated, expected 0
INFO  lib389.utils:accpol_test.py:202 Check ModifyTimeStamp attribute present for user
INFO  lib389.utils:accpol_test.py:237 Enable account by replacing cn attribute value, value of modifyTimeStamp changed
INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN
INFO  lib389.utils:accpol_test.py:254 Delete lastLoginTime/createTimeStamp/ModifyTimeStamp attribute from user account
INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute
INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_glnoalt_nologin 50.61
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:742 Set attribute altStateAttrName to 1.1
INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN
INFO  lib389.utils:accpol_test.py:744 Set attribute alwaysrecordlogin to No
INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN
INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs
INFO  lib389.utils:accpol_test.py:748 Sleep for 13 secs to check if account is not inactivated, expected 0
INFO  lib389.utils:accpol_test.py:753 Set attribute altStateAttrName to createTimestamp
INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN
INFO  lib389.utils:accpol_test.py:758 Reset the default attribute values
INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN
INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute
INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_glinact_nsact 23.18
------------------------------Captured stderr call------------------------------
Additional information: Invalid attribute in filter - results may not be complete.
Additional information: Invalid attribute in filter - results may not be complete.
Additional information: Invalid attribute in filter - results may not be complete.
Additional information: Invalid attribute in filter - results may not be complete.
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:798 AccountInactivityLimit set to 12. Account will be inactivated if not accessed in 12 secs
INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs
INFO  lib389.utils:accpol_test.py:800 Sleep for 3 secs to check if account is not inactivated, expected value 0
INFO  lib389.utils:accpol_test.py:162 Account activate/in-activate/status using ns-activate/inactivate/accountstatus.pl
INFO  lib389.utils:accpol_test.py:166 Running ns-activate.pl for user uid=nsactusr1,ou=groups,dc=example,dc=com
INFO  lib389.utils:accpol_test.py:180 output: b'uid=nsactusr1,ou=groups,dc=example,dc=com already activated.\n'
INFO  lib389.utils:accpol_test.py:803 Sleep for 10 secs to check if account is inactivated, expected value 19
INFO  lib389.utils:accpol_test.py:162 Account activate/in-activate/status using ns-activate/inactivate/accountstatus.pl
INFO  lib389.utils:accpol_test.py:166 Running ns-activate.pl for user uid=nsactusr1,ou=groups,dc=example,dc=com
INFO  lib389.utils:accpol_test.py:180 output: b'uid=nsactusr1,ou=groups,dc=example,dc=com already activated.\n'
INFO  lib389.utils:accpol_test.py:162 Account activate/in-activate/status using ns-activate/inactivate/accountstatus.pl
INFO  lib389.utils:accpol_test.py:166 Running ns-accountstatus.pl for user uid=nsactusr1,ou=groups,dc=example,dc=com
INFO  lib389.utils:accpol_test.py:180 output: b'uid=nsactusr1,ou=groups,dc=example,dc=com - inactivated (inactivity limit exceeded).\n'
INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute
INFO  lib389.utils:accpol_test.py:162 Account activate/in-activate/status using ns-activate/inactivate/accountstatus.pl
INFO  lib389.utils:accpol_test.py:166 Running ns-accountstatus.pl for user uid=nsactusr1,ou=groups,dc=example,dc=com
INFO  lib389.utils:accpol_test.py:180 output: b'uid=nsactusr1,ou=groups,dc=example,dc=com - activated.\n'
INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_glinact_acclock 40.30
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:843 AccountInactivityLimit set to 12. Account will be inactivated if not accessed in 12 secs
INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs
INFO  lib389.utils:accpol_test.py:845 Sleep for 3 secs and try invalid binds to lockout the user
INFO  lib389.utils:accpol_test.py:118 Lockout user account by attempting invalid password binds
INFO  lib389.utils:accpol_test.py:849 Sleep for 10 secs to check if account is inactivated, expected value 19
INFO  lib389.utils:accpol_test.py:853 Add lastLoginTime to activate the user account
INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute
INFO  lib389.utils:accpol_test.py:857 Checking if account is unlocked after passwordlockoutduration, but inactivated after accountInactivityLimit
INFO  lib389.utils:accpol_test.py:118 Lockout user account by attempting invalid password binds
INFO  lib389.utils:accpol_test.py:861 Account is expected to be unlocked after 5 secs of passwordlockoutduration
INFO  lib389.utils:accpol_test.py:865 Sleep 13s and check if account inactivated based on accountInactivityLimit, expected 19
INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_glnact_pwexp 49.34
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:908 AccountInactivityLimit set to 12. Account will be inactivated if not accessed in 12 secs
INFO  lib389.utils:accpol_test.py:909 Passwordmaxage is set to 9. Password will expire in 9 secs
INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs
INFO  lib389.utils:accpol_test.py:912 Sleep for 9 secs and check if password expired
INFO  lib389.utils:accpol_test.py:918 Add lastLoginTime to activate the user account
INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute
INFO  lib389.utils:accpol_test.py:141 Reset user password for user-uid=pwexpusr1,ou=groups,dc=example,dc=com
INFO  lib389.utils:accpol_test.py:141 Reset user password for user-uid=pwexpusr1,ou=groups,dc=example,dc=com
INFO  lib389.utils:accpol_test.py:929 Sleep for 4 secs and check if account is now inactivated, expected error 19
INFO  lib389.utils:accpol_test.py:141 Reset user password for user-uid=pwexpusr1,ou=groups,dc=example,dc=com
INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute
INFO  lib389.utils:accpol_test.py:141 Reset user password for user-uid=pwexpusr1,ou=groups,dc=example,dc=com
INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_locact_inact 32.06
-------------------------------Captured log setup-------------------------------
INFO  lib389.utils:accpol_test.py:80 Adding Local account policy plugin configuration entries
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:980 AccountInactivityLimit set to 10. Account will be inactivated if not accessed in 10 secs
INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs
INFO  lib389.utils:accpol_test.py:982 Sleep for 9 secs to check if account is not inactivated, expected value 0
INFO  lib389.utils:accpol_test.py:984 Account should not be inactivated since AccountInactivityLimit not exceeded
INFO  lib389.utils:accpol_test.py:986 Sleep for 2 more secs to check if account is inactivated
INFO  lib389.utils:accpol_test.py:989 Sleep +9 secs to check if account inactusr3 is inactivated
INFO  lib389.utils:accpol_test.py:992 Add lastLoginTime attribute to all users and check if its activated
INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute
INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_locinact_modrdn 27.13
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:1026 Account should not be inactivated since the subtree is not configured
INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs
INFO  lib389.utils:accpol_test.py:1028 Sleep for 11 secs to check if account is not inactivated, expected value 0
INFO  lib389.utils:accpol_test.py:1031 Moving users from ou=groups to ou=people subtree
INFO  lib389.utils:accpol_test.py:1039 Then wait for 11 secs and check if entries are inactivated
INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute
INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_locact_modrdn 15.11
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:1071 Account should be inactivated since the subtree is configured
INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs
INFO  lib389.utils:accpol_test.py:1073 Sleep for 11 secs to check if account is inactivated, expected value 19
INFO  lib389.utils:accpol_test.py:1076 Moving users from ou=people to ou=groups subtree
INFO  lib389.utils:accpol_test.py:1083 Sleep for +2 secs and check users from both ou=people and ou=groups subtree
INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
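The accpol tests above all hang off one Account Policy plugin configuration entry whose attributes the logs keep naming: accountInactivityLimit (seconds), stateattrname/altstateattrname (which timestamp is consulted), and alwaysrecordlogin. A sketch of that setup over raw LDAP; the values mirror the logs, while the connection details are assumptions.

import ldap
import ldap.modlist

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('cn=Directory Manager', 'password')

PLUGIN_DN = 'cn=Account Policy Plugin,cn=plugins,cn=config'
CONFIG_DN = 'cn=config,' + PLUGIN_DN

conn.modify_s(PLUGIN_DN, [
    (ldap.MOD_REPLACE, 'nsslapd-pluginEnabled', b'on'),
    (ldap.MOD_REPLACE, 'nsslapd-pluginarg0', CONFIG_DN.encode()),
])
conn.add_s(CONFIG_DN, ldap.modlist.addModlist({
    'objectClass': [b'top', b'extensibleObject'],
    'cn': [b'config'],
    'alwaysrecordlogin': [b'yes'],
    'stateattrname': [b'lastLoginTime'],
    'altstateattrname': [b'createTimestamp'],
    'specattrname': [b'acctPolicySubentry'],
    'limitattrname': [b'accountInactivityLimit'],
    'accountInactivityLimit': [b'12'],
}))
# After a restart, a bind on an account idle longer than 12 secs fails
# with err=19, the "expected value 19" the logs keep checking for.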
Passed suites/plugins/attr_nsslapd-pluginarg_test.py::test_duplicate_values 3.49
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
INFO  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:31 Ticket 47431 - 0: Enable 7bit plugin...
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:56 Ticket 47431 - 1: Check 26 duplicate values are treated as one...
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:59 modify_s cn=7-bit check,cn=plugins,cn=config
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:83 line: [20/Oct/2020:20:38:13.977755329 -0400] - WARN - str2entry_dupcheck - 26 duplicate values for attribute type nsslapd-pluginarg2 detected in entry cn=7-bit check,cn=plugins,cn=config. Extra values ignored.
INFO  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:84 Expected error "str2entry_dupcheck.* duplicate values for attribute type nsslapd-pluginarg2 detected in entry cn=7-bit check,cn=plugins,cn=config." logged in /var/log/dirsrv/slapd-standalone1/errors
INFO  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:86 Ticket 47431 - 1: done
Passed suites/plugins/attr_nsslapd-pluginarg_test.py::test_multiple_value 5.60
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:111 Ticket 47431 - 2: Check two values belonging to one arg is fixed...
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:131 line - [20/Oct/2020:20:38:18.584498334 -0400] - DEBUG - NS7bitAttr - NS7bitAttr_Init - 0: uid
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:132 ATTRS[0] uid
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:136 uid was logged
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:131 line - [20/Oct/2020:20:38:18.590075166 -0400] - DEBUG - NS7bitAttr - NS7bitAttr_Init - 1: mail
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:132 ATTRS[1] mail
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:136 mail was logged
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:131 line - [20/Oct/2020:20:38:18.593044030 -0400] - DEBUG - NS7bitAttr - NS7bitAttr_Init - 2: userpassword
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:132 ATTRS[2] userpassword
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:136 userpassword was logged
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:131 line - [20/Oct/2020:20:38:18.595954808 -0400] - DEBUG - NS7bitAttr - NS7bitAttr_Init - 3: ,
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:132 ATTRS[3] ,
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:136 , was logged
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:131 line - [20/Oct/2020:20:38:18.598976159 -0400] - DEBUG - NS7bitAttr - NS7bitAttr_Init - 4: dc=example,dc=com
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:132 ATTRS[4] dc=example,dc=com
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:136 dc=example,dc=com was logged
INFO  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:142 Ticket 47431 - 2: done
Passed suites/plugins/attr_nsslapd-pluginarg_test.py::test_missing_args 4.24
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:170 Ticket 47431 - 3: Check missing args are fixed...
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:197 uid was logged
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:197 mail was logged
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:197 userpassword was logged
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:197 , was logged
DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:197 dc=example,dc=com was logged
INFO  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:203 Ticket 47431 - 3: done
INFO  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:204 Test complete
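All three ticket 47431 checks manipulate the positional nsslapd-pluginargN values of the 7-bit check plugin; the layout the DEBUG lines decode is: attribute names to check, a ',' separator, then the subtrees. Setting the arguments explicitly looks like this (connection details are assumptions; the DN and values come straight from the log above):

import ldap

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('cn=Directory Manager', 'password')
conn.modify_s('cn=7-bit check,cn=plugins,cn=config', [
    (ldap.MOD_REPLACE, 'nsslapd-pluginarg0', b'uid'),
    (ldap.MOD_REPLACE, 'nsslapd-pluginarg1', b'mail'),
    (ldap.MOD_REPLACE, 'nsslapd-pluginarg2', b'userpassword'),
    (ldap.MOD_REPLACE, 'nsslapd-pluginarg3', b','),
    (ldap.MOD_REPLACE, 'nsslapd-pluginarg4', b'dc=example,dc=com'),
])
# The plugin re-reads its args at startup; duplicate or missing argN slots
# are collapsed/repaired then, which is what the tests above verify.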
Passed suites/plugins/cos_test.py::test_cos_operational_default 4.54
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
CRITICAL lib389.utils:cos_test.py:27 Adding user (uid=user_0,ou=people,dc=example,dc=com):
INFO  lib389.utils:cos_test.py:153 Returned telephonenumber (exp. real): b'1234 is real'
INFO  lib389.utils:cos_test.py:154 Returned telephonenumber: 8
INFO  lib389.utils:cos_test.py:160 Returned l (exp. real): b'here is real'
INFO  lib389.utils:cos_test.py:161 Returned l: 8
INFO  lib389.utils:cos_test.py:170 Returned seealso (exp. virtual): b'dc=virtual,dc=example,dc=com'
INFO  lib389.utils:cos_test.py:171 Returned seealso: 3
INFO  lib389.utils:cos_test.py:180 Returned description (exp. virtual): b'desc is virtual'
INFO  lib389.utils:cos_test.py:181 Returned description: 8
INFO  lib389.utils:cos_test.py:191 Returned title (exp. real): b'title is real'
INFO  lib389.utils:cos_test.py:212 Returned title(exp. virt): b'title is virtual 1'
INFO  lib389.utils:cos_test.py:212 Returned title(exp. virt): b'title is virtual 0'
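test_cos_operational_default compares real attribute values stored on the entry ("is real") with virtual ones supplied by Class of Service ("is virtual"). For context, the shape of a pointer CoS: a template entry carries the virtual values and a cosPointerDefinition applies it to a subtree. DNs and values here are illustrative assumptions, not the test's data.

import ldap
import ldap.modlist

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('cn=Directory Manager', 'password')

conn.add_s('cn=cosTemplate,dc=example,dc=com', ldap.modlist.addModlist({
    'objectClass': [b'top', b'cosTemplate', b'extensibleObject'],
    'description': [b'desc is virtual'],
}))
conn.add_s('cn=cosDefinition,dc=example,dc=com', ldap.modlist.addModlist({
    'objectClass': [b'top', b'cosSuperDefinition', b'cosPointerDefinition'],
    'cosTemplateDn': [b'cn=cosTemplate,dc=example,dc=com'],
    # "default" qualifier: a real description stored on the entry wins
    # over the template's virtual value
    'cosAttribute': [b'description default'],
}))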
Passed suites/plugins/deref_aci_test.py::test_deref_and_access_control 0.66
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.deref_aci_test:deref_aci_test.py:133 Check, that the dereference search result does not have userpassword
Passed suites/plugins/dna_test.py::test_dnatype_only_valid 4.27
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
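test_dnatype_only_valid asserts that the DNA plugin refuses a dnaType that is not a valid attribute type. For context, a well-formed DNA range entry looks roughly like this (DN and values are illustrative assumptions):

import ldap
import ldap.modlist

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('cn=Directory Manager', 'password')

DNA_DN = ('cn=dna uid,cn=Distributed Numeric Assignment Plugin,'
          'cn=plugins,cn=config')
conn.add_s(DNA_DN, ldap.modlist.addModlist({
    'objectClass': [b'top', b'extensibleObject'],
    'cn': [b'dna uid'],
    'dnaType': [b'uidNumber'],      # must name a real attribute type
    'dnaMagicRegen': [b'0'],        # writing this value triggers assignment
    'dnaFilter': [b'(objectclass=posixAccount)'],
    'dnaScope': [b'ou=people,dc=example,dc=com'],
    'dnaNextValue': [b'1000'],
    'dnaMaxValue': [b'2000'],
}))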
Passed suites/plugins/entryusn_test.py::test_entryusn_no_duplicates 5.25
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/plugins/entryusn_test.py::test_entryusn_is_same_after_failure 5.13
No log output captured.
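The entryusn tests depend on the USN plugin assigning a strictly increasing entryusn to every successful write, so a failed operation must not consume a number (test_entryusn_is_same_after_failure) and no two writes may share one (test_entryusn_no_duplicates). A sketch of the setup with lib389; the serverid is an assumption, and global mode is toggled through cn=config:

from lib389 import DirSrv
from lib389.plugins import USNPlugin

inst = DirSrv()
inst.local_simple_allocate(serverid='standalone1', password='password')
inst.open()

USNPlugin(inst).enable()
# Optional: one shared USN counter across all backends
inst.config.replace('nsslapd-entryusn-global', 'on')
inst.restart()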
Passed suites/plugins/managed_entry_test.py::test_binddn_tracking 2.33
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/plugins/managed_entry_test.py::test_mentry01 10.52
No log output captured.
Passed suites/plugins/managed_entry_test.py::test_managed_entry_removal 5.51
No log output captured.
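The managed_entry tests drive the Managed Entries plugin, which auto-creates a companion entry (here, a group) whenever an entry matching a filter appears in a scope. The two moving parts are a template and a config entry under the plugin; this sketch uses the classic posixAccount-to-posixGroup shape with illustrative DNs:

import ldap
import ldap.modlist

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('cn=Directory Manager', 'password')

TEMPLATE_DN = 'cn=MEP group template,dc=example,dc=com'
conn.add_s(TEMPLATE_DN, ldap.modlist.addModlist({
    'objectClass': [b'top', b'mepTemplateEntry'],
    'cn': [b'MEP group template'],
    'mepRDNAttr': [b'cn'],
    'mepStaticAttr': [b'objectclass: posixGroup'],
    'mepMappedAttr': [b'cn: $uid', b'gidNumber: $gidNumber'],
}))
conn.add_s('cn=group mep,cn=managed entries,cn=plugins,cn=config',
           ldap.modlist.addModlist({
    'objectClass': [b'top', b'extensibleObject'],
    'cn': [b'group mep'],
    'originScope': [b'ou=people,dc=example,dc=com'],
    'originFilter': [b'objectclass=posixAccount'],
    'managedBase': [b'ou=groups,dc=example,dc=com'],
    'managedTemplate': [TEMPLATE_DN.encode()],
}))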
Passed suites/plugins/memberof_test.py::test_betxnpostoperation_replace 4.52
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/plugins/memberof_test.py::test_memberofgroupattr_add 0.01
No log output captured.
Passed suites/plugins/memberof_test.py::test_enable 4.37
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:181 Enable MemberOf plugin
Passed suites/plugins/memberof_test.py::test_member_add 0.70
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofenh1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofenh2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:211 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp1,ou=groups,dc=example,dc=com' (member)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:212 Update b'uid=user_memofenh2,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp1,ou=groups,dc=example,dc=com' (uniqueMember)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:215 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp2,ou=groups,dc=example,dc=com' (member)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:216 Update b'uid=user_memofenh2,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp2,ou=groups,dc=example,dc=com' (uniqueMember)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
Passed suites/plugins/memberof_test.py::test_member_delete_gr1 0.51
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:246 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is no longer memberof b'cn=group_memofegrp1,ou=groups,dc=example,dc=com' (member)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
Passed suites/plugins/memberof_test.py::test_member_delete_gr2 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:278 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is no longer memberof b'cn=group_memofegrp1,ou=groups,dc=example,dc=com' (uniqueMember)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
Passed suites/plugins/memberof_test.py::test_member_delete_all 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:310 Update b'uid=user_memofenh2,ou=people,dc=example,dc=com' is no longer memberof b'cn=group_memofegrp1,ou=groups,dc=example,dc=com' (uniqueMember)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:314 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is no longer memberof b'cn=group_memofegrp2,ou=groups,dc=example,dc=com' (member)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
Passed suites/plugins/memberof_test.py::test_member_after_restart 8.24
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:349 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp1,ou=groups,dc=example,dc=com' (member)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:353 Update b'uid=user_memofenh2,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp2,ou=groups,dc=example,dc=com' (uniqueMember)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:364 Remove uniqueMember as a memberofgrpattr
INFO  tests.suites.plugins.memberof_test:memberof_test.py:371 Assert that this change of configuration did change the already set values
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
Passed suites/plugins/memberof_test.py::test_memberofgroupattr_uid 0.00
-------------------------------Captured log call--------------------------------
ERROR  tests.suites.plugins.memberof_test:memberof_test.py:400 Setting 'memberUid' as memberofgroupattr is rejected (expected)
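memberUid (from posixGroup) carries plain uid strings rather than DNs, so the MemberOf plugin has nothing it can resolve to an entry and refuses the value. A sketch of the rejected modification; treating the refusal as UNWILLING_TO_PERFORM is an assumption about the exact result code:

import ldap

conn = ldap.initialize("ldap://localhost:389")
conn.simple_bind_s("cn=Directory Manager", "password")

try:
    conn.modify_s("cn=MemberOf Plugin,cn=plugins,cn=config", [
        (ldap.MOD_REPLACE, "memberOfGroupAttr", [b"memberUid"]),
    ])
except ldap.UNWILLING_TO_PERFORM:
    # Expected: memberUid is not a DN-syntax attribute, matching the
    # "(expected)" ERROR record above.
    print("memberUid rejected as memberofgroupattr")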
Passed suites/plugins/memberof_test.py::test_member_add_duplicate_usr1 0.04
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:422 Try b'uid=user_memofenh1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp1,ou=groups,dc=example,dc=com' (member)
ERROR  tests.suites.plugins.memberof_test:memberof_test.py:429 b'uid=user_memofenh1,ou=people,dc=example,dc=com' already member of b'cn=group_memofegrp1,ou=groups,dc=example,dc=com' --> fail (expected)
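The deliberate failure here is plain LDAP semantics: adding a value that a multi-valued attribute already holds returns attributeOrValueExists. A minimal sketch with the DNs from the log, assuming the connection from the first sketch's parameters:

import ldap

conn = ldap.initialize("ldap://localhost:389")
conn.simple_bind_s("cn=Directory Manager", "password")

group_dn = "cn=group_memofegrp1,ou=groups,dc=example,dc=com"
user_dn = b"uid=user_memofenh1,ou=people,dc=example,dc=com"

try:
    # The user is already listed under 'member', so this MOD_ADD must fail.
    conn.modify_s(group_dn, [(ldap.MOD_ADD, "member", [user_dn])])
except ldap.TYPE_OR_VALUE_EXISTS:
    print("duplicate member add rejected, as the test expects")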
Passed suites/plugins/memberof_test.py::test_member_add_duplicate_usr2 0.78
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:450 Check initial status
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:460 Try b'uid=user_memofenh2,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp2,ou=groups,dc=example,dc=com' (member)
ERROR  tests.suites.plugins.memberof_test:memberof_test.py:467 b'uid=user_memofenh2,ou=people,dc=example,dc=com' already member of b'cn=group_memofegrp2,ou=groups,dc=example,dc=com' --> fail (expected)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:470 Check final status
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
Passed suites/plugins/memberof_test.py::test_member_uniquemember_same_user 0.04
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:557 Check initial status
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:569 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp3,ou=groups,dc=example,dc=com' (member)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:570 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp3,ou=groups,dc=example,dc=com' (uniqueMember)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:582 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is not memberof b'cn=group_memofegrp3,ou=groups,dc=example,dc=com' (member)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:586 Update b'uid=user_memofenh2,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp3,ou=groups,dc=example,dc=com' (member)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:613 Checking final status
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
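test_member_uniquemember_same_user puts the same user DN under both member and uniqueMember of group_memofegrp3 and then drops one of the two: the lookups above show a single memberOf value for the group throughout, persisting while either membership attribute remains. A sketch of that invariant, reusing the lookup_memberof helper assumed in the first sketch:

group = b"cn=group_memofegrp3,ou=Groups,dc=example,dc=com"
user = "uid=user_memofenh1,ou=people,dc=example,dc=com"

# Even when the user sits in both 'member' and 'uniqueMember', the plugin
# maintains at most one memberOf value for the group.
assert lookup_memberof(user).count(group) <= 1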
Passed suites/plugins/memberof_test.py::test_member_not_exists 0.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:671 Checking Initial status
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp015,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:691 Update b'uid=user_dummy1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp015,ou=groups,dc=example,dc=com' (member)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:692 Update b'uid=user_dummy2,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp015,ou=groups,dc=example,dc=com' (uniqueMember)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
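test_member_not_exists points group_memofegrp015 at users that were never created (user_dummy1/2); a membership attribute may legitimately hold such dangling DNs, and the plugin then has no entry on which to write memberOf, so none of the existing users' state changes. A sketch of the corresponding check; the NO_SUCH_OBJECT handling is an assumption about how a base search on a missing entry fails:

import ldap

def memberof_or_none(user_dn):
    """memberOf values if the entry exists, None when it does not."""
    try:
        return lookup_memberof(user_dn)  # helper from the first sketch
    except ldap.NO_SUCH_OBJECT:
        return None

# The dummy users were never created, so the lookup itself fails.
assert memberof_or_none("uid=user_dummy1,ou=people,dc=example,dc=com") is None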
Passed suites/plugins/memberof_test.py::test_member_not_exists_complex 0.04
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:806 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp016,ou=groups,dc=example,dc=com' (member)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:807 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp016,ou=groups,dc=example,dc=com' (uniqueMember)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:835 Update b'uid=user_dummy1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp016,ou=groups,dc=example,dc=com' (member)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:845 Update b'uid=user_dummy1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp016,ou=groups,dc=example,dc=com' (uniqueMember)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
Passed suites/plugins/memberof_test.py::test_complex_group_scenario_1 0.31
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp017,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:1011 Update b'uid=user_memofuser1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp017,ou=groups,dc=example,dc=com' (member)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:1012 Update b'uid=user_memofuser2,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp017,ou=groups,dc=example,dc=com' (uniqueMember)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:1013 Update b'uid=user_memofuser3,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp017,ou=groups,dc=example,dc=com' (memberuid)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
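Note that all six lookups for user_memofuser3 return no memberof record: that user was attached to group_memofegrp017 only through memberuid, which stores uid strings rather than DNs, so the plugin (tracking member and uniqueMember) never writes memberOf to the entry. As a one-line check, again with the lookup_memberof helper assumed in the first sketch:

# user_memofuser3 was added only via memberUid, so no memberOf may appear.
assert lookup_memberof("uid=user_memofuser3,ou=people,dc=example,dc=com") == []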
Passed suites/plugins/memberof_test.py::test_complex_group_scenario_2 0.33
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from 
b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com
    [the two entries above repeat several more times]
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
    [repeated several times; no memberof values returned]
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp018,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:1261 Update b'uid=user_memofuser1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp017,ou=groups,dc=example,dc=com' (member)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:1262 Update b'uid=user_memofuser1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp017,ou=groups,dc=example,dc=com' (uniqueMember)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:1263 Update b'uid=user_memofuser1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp017,ou=groups,dc=example,dc=com' (memberuid)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp018,ou=Groups,dc=example,dc=com
    [the three entries above repeat several more times]
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:1283 Update b'uid=user_memofuser1,ou=people,dc=example,dc=com' is no longer memberof b'cn=group_memofegrp018,ou=groups,dc=example,dc=com' (member)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:1284 Update b'uid=user_memofuser1,ou=people,dc=example,dc=com' is no longer memberof b'cn=group_memofegrp018,ou=groups,dc=example,dc=com' (uniqueMember)
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=Groups,dc=example,dc=com
    [the two entries above repeat several more times]
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com
    [lookups for uid=user_memofenh1 repeat several times with this group set]
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
    [lookups for uid=user_memofenh2 repeat several times with this group set]
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
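A note for readers of this report: each "Lookup memberof" / "memberof:" pair in these logs is the suite reading the memberOf attribute back from an entry and comparing it against the expected group set; the repeated entries come from re-reading until the MemberOf plugin has applied its updates. A minimal sketch of that verification step, using the lib389 idm layer (the helper name, `inst`, and the expected set are illustrative assumptions, not the suite's own code):

from lib389._constants import DEFAULT_SUFFIX
from lib389.idm.user import UserAccounts

def assert_memberof(inst, uid, expected_group_dns):
    # inst is assumed to be a connected DirSrv instance, e.g. from a topology fixture.
    user = UserAccounts(inst, DEFAULT_SUFFIX).get(uid)
    got = set(user.get_attr_vals_utf8('memberOf'))
    # Corresponds to the "--> membership verified" log entry.
    assert got == set(expected_group_dns)

Called in a retry loop, a helper like this produces exactly the lookup/verify cadence seen above.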
Passed suites/plugins/memberof_test.py::test_complex_group_scenario_3 0.15
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
    [the lookups above repeat several more times with the same group sets]
INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp019_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp019_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp019_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_1,ou=groups,dc=example,dc=com'
    [repeated several times; no memberof values returned]
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=Groups,dc=example,dc=com
    [the two entries above repeat many more times]
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=Groups,dc=example,dc=com
    [the three entries above repeat several more times]
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=Groups,dc=example,dc=com
    [the three entries above repeat several more times]
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
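The group_memofegrp019_* entries above show nested-group propagation: group_memofegrp019_2 is itself a member of group_memofegrp019_1, so both the nested group and its users acquire memberOf values for the parent. A sketch of that setup with the lib389 idm layer, assuming the MemberOf plugin is enabled as it is for this suite (variable names and `inst` are illustrative):

from lib389._constants import DEFAULT_SUFFIX
from lib389.idm.group import Groups
from lib389.idm.user import UserAccounts

groups = Groups(inst, DEFAULT_SUFFIX)
g1 = groups.create(properties={'cn': 'group_memofegrp019_1'})
g2 = groups.create(properties={'cn': 'group_memofegrp019_2'})
g1.add_member(g2.dn)                     # nest _2 inside _1

user2 = UserAccounts(inst, DEFAULT_SUFFIX).get('user_memofuser2')
g2.add_member(user2.dn)

assert g2.present('memberOf', g1.dn)     # the nested group carries memberOf
assert user2.present('memberOf', g2.dn)  # direct membership
assert user2.present('memberOf', g1.dn)  # inherited through the nesting

This matches the log pattern where lookups on cn=group_memofegrp019_2 return cn=group_memofegrp019_1, and lookups on uid=user_memofuser2 return both groups.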
Passed suites/plugins/memberof_test.py::test_complex_group_scenario_4 0.15
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
    [the lookups above repeat several more times with the same group sets]
INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
    [successive lookups return a growing set as the user is added to the other
    groups: group_memofegrp020_1/_5/_2, then also _3, then also _4, each time
    followed by "--> membership verified"]
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
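The grow-then-shrink memberOf set above is the plugin reacting to member adds and removals on the group side. A sketch of one such round trip with lib389 (again an editor's illustration, not the suite's code; `inst` is a connected DirSrv instance):

from lib389._constants import DEFAULT_SUFFIX
from lib389.idm.group import Groups
from lib389.idm.user import UserAccounts

user1 = UserAccounts(inst, DEFAULT_SUFFIX).get('user_memofuser1')
g3 = Groups(inst, DEFAULT_SUFFIX).get('group_memofegrp020_3')

g3.add_member(user1.dn)
assert user1.present('memberOf', g3.dn)      # value appears after the add ...

g3.remove_member(user1.dn)
assert not user1.present('memberOf', g3.dn)  # ... and is withdrawn after removal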
Passed suites/plugins/memberof_test.py::test_complex_group_scenario_5 0.13
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 
--> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check 
b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check 
b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'
[Captured log, condensed: memberof_test.py:94/:101 checks that cn=group_memofegrp020_5,ou=groups,dc=example,dc=com carries each of cn=group_memofegrp020_4, uid=user_memofuser2, uid=user_memofuser3 and uid=user_memofuser4 in both member and uniqueMember; every member read returns uid=user_memofuser1 plus cn=group_memofegrp020_1 through cn=group_memofegrp020_4. memberof_test.py:109/:116/:119 then repeatedly looks up memberOf: uid=user_memofenh1 reports cn=group_memofegrp1, cn=group_memofegrp3 and cn=group_memofegrp016, and uid=user_memofenh2 reports cn=group_memofegrp2 and cn=group_memofegrp3, with "--> membership verified" after each passing check.]
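The member/uniqueMember checks logged above boil down to reading a group entry's attribute values and asserting that an expected DN is present. A minimal sketch of such a check using lib389's Groups collection (the same basedn/get pattern that appears elsewhere in this report); the instance handle inst, the suffix and the helper name are assumptions for illustration, not values taken from the test code:

from lib389.idm.group import Groups

def check_member(inst, group_cn, expected_dn, attr='member'):
    """Hypothetical helper: assert expected_dn is a value of attr on the group."""
    groups = Groups(inst, basedn='dc=example,dc=com')
    group = groups.get(group_cn)
    # Read every value of member (or uniqueMember) and log it, mirroring
    # the memberof_test.py:94/:101 records above.
    values = group.get_attr_vals_utf8(attr)
    for dn in values:
        print('%s: %s' % (attr, dn))
    assert expected_dn in values

Called as check_member(inst, 'group_memofegrp020_5', 'uid=user_memofuser2,ou=people,dc=example,dc=com', attr='uniqueMember'), it would reproduce one of the checks recorded above.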
Passed suites/plugins/memberof_test.py::test_complex_group_scenario_6 5.25
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from 
b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check 
b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  
tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> 
membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  
tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  
tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com
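The memberOf checks recorded above and below follow one fixed pattern: read an entry's memberOf values (the lookups logged at memberof_test.py:109 and the values at :116) and compare them against the expected group set (the "--> membership verified" line at :119). A minimal sketch of that lookup, assuming python-ldap and placeholder connection details (URI, bind DN, password) that are not taken from this report:

    # Minimal sketch of the memberOf verification these logs record.
    # Assumptions: python-ldap is installed; LDAP_URI, ROOT_DN and ROOT_PW
    # are placeholders, not values taken from this report.
    import ldap

    LDAP_URI = "ldap://localhost:389"   # hypothetical server URI
    ROOT_DN = "cn=Directory Manager"    # hypothetical bind DN
    ROOT_PW = "password"                # hypothetical bind password

    def get_memberof(conn, dn):
        # BASE-scope read of one entry, returning its memberOf values
        # (what the tests print at memberof_test.py:116).
        res = conn.search_s(dn, ldap.SCOPE_BASE, "(objectClass=*)", ["memberOf"])
        _, attrs = res[0]
        return {v.decode("utf-8") for v in attrs.get("memberOf", [])}

    conn = ldap.initialize(LDAP_URI)
    conn.simple_bind_s(ROOT_DN, ROOT_PW)
    groups = get_memberof(conn, "uid=user_memofuser1,ou=people,dc=example,dc=com")
    # "--> membership verified" corresponds to a containment check like this:
    assert "cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com" in groups

The repeated lookups for the same DN with growing group sets reflect the plugin propagating nested membership (group_memofegrp020_5 holds groups _1 through _4 as members), so each retry sees more memberOf values until the expected set matches.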
Passed suites/plugins/memberof_test.py::test_complex_group_scenario_7 0.11
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from 
b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  
tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 
memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com 
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from 
b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from 
b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=Groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check 
b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check 
b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser4,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
Passed suites/plugins/memberof_test.py::test_complex_group_scenario_8 0.06
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser4,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser4,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
Passed suites/plugins/memberof_test.py::test_complex_group_scenario_9 0.10
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser4,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser4,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=Groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
Passed suites/plugins/memberof_test.py::test_memberof_auto_add_oc 0.15
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from uid=user1,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:2753 Correctly rejected invalid objectclass
INFO  tests.suites.plugins.memberof_test:memberof_test.py:2823 Test complete.
Passed suites/plugins/pluginpath_validation_test.py::test_pluginpath_validation 0.27
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.pluginpath_validation_test:pluginpath_validation_test.py:103 Test complete
Passed suites/plugins/referint_test.py::test_referential_false_failure 7.03
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo0,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo1,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo2,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo3,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo4,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo5,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo6,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo7,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo8,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo9,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo10,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo11,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo12,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo13,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo14,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo15,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo16,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo17,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo18,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo19,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo20,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo21,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo22,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo23,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo24,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo25,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo26,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo27,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo28,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo29,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo30,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo31,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo32,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo33,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo34,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo35,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo36,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo37,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo38,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo39,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo40,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo41,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo42,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo43,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo44,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo45,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo46,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo47,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo48,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo49,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo50,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo51,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo52,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo53,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo54,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo55,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo56,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo57,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo58,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo59,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo60,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo61,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo62,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo63,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo64,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo65,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo66,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo67,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo68,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo69,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo70,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo71,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo72,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo73,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo74,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo75,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo76,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo77,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo78,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo79,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo80,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo81,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo82,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo83,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo84,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo85,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo86,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo87,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo88,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo89,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo90,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo91,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo92,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo93,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo94,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo95,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo96,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo97,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo98,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo99,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo100,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo101,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo102,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo103,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo104,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo105,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo106,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo107,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo108,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo109,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo110,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo111,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo112,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo113,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo114,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo115,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo116,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo117,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo118,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo119,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo120,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo121,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo122,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo123,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo124,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo125,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo126,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo127,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo128,dc=example,dc=com):
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo129,dc=example,dc=com):
Passed suites/plugins/rootdn_plugin_test.py::test_rootdn_access_specific_time 0.04
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:60 Initializing root DN test suite... INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:70 test_rootdn_init: Initialized root DN test suite. INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:36 Cleaning up the config area
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:102 Running test_rootdn_access_specific_time...
Passed suites/plugins/rootdn_plugin_test.py::test_rootdn_access_day_of_week 2.56
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:36 Cleaning up the config area
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:185 Running test_rootdn_access_day_of_week... INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:202 Today: Wed INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:203 Allowed days: Wed,Thu INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:204 Deny days: Thu,Fri
Passed suites/plugins/rootdn_plugin_test.py::test_rootdn_access_denied_ip 2.91
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:36 Cleaning up the config area
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:268 Running test_rootdn_access_denied_ip...
Passed suites/plugins/rootdn_plugin_test.py::test_rootdn_access_denied_host 2.56
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:36 Cleaning up the config area
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:333 Running test_rootdn_access_denied_host...
Passed suites/plugins/rootdn_plugin_test.py::test_rootdn_access_allowed_ip 2.58
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:36 Cleaning up the config area
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:400 Running test_rootdn_access_allowed_ip...
Passed suites/plugins/rootdn_plugin_test.py::test_rootdn_access_allowed_host 2.57
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:36 Cleaning up the config area
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:466 Running test_rootdn_access_allowed_host...
Passed suites/plugins/rootdn_plugin_test.py::test_rootdn_config_validate 0.03
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:36 Cleaning up the config area
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:575 Add just "rootdn-open-time"
Passed suites/plugins/rootdn_plugin_test.py::test_rootdn_access_denied_ip_wildcard 2.55
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:36 Cleaning up the config area
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:666 Running test_rootdn_access_denied_ip_wildcard...
Passed suites/plugins/rootdn_plugin_test.py::test_rootdn_access_allowed_ip_wildcard 3.06
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:36 Cleaning up the config area
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:712 Running test_rootdn_access_allowed_ip_wildcard...
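The rootdn_plugin cases above exercise the Root DN Access Control plugin's time-, day-, IP- and host-based restrictions. A minimal sketch of setting those restrictions directly on the plugin entry with python-ldap; the attribute names match the ones shown in the log (e.g. "rootdn-open-time"), while the URI, password, and values are illustrative:

    import ldap

    PLUGIN_DN = "cn=RootDN Access Control,cn=plugins,cn=config"

    conn = ldap.initialize("ldap://localhost:38901")  # placeholder URI
    conn.simple_bind_s("cn=Directory Manager", "password")

    # Allow root DN binds only between 08:00 and 17:00, on Wed/Thu,
    # and only from the loopback address.
    conn.modify_s(PLUGIN_DN, [
        (ldap.MOD_REPLACE, "rootdn-open-time", [b"0800"]),
        (ldap.MOD_REPLACE, "rootdn-close-time", [b"1700"]),
        (ldap.MOD_REPLACE, "rootdn-days-allowed", [b"Wed, Thu"]),
        (ldap.MOD_REPLACE, "rootdn-allow-ip", [b"127.0.0.1"]),
    ])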
Passed suites/psearch/psearch_test.py::test_psearch 2.02
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:psearch_test.py:30 dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 cn=Directory Administrators,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 ou=Groups,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 ou=People,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 ou=Special Users,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 cn=Accounting Managers,ou=Groups,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 cn=HR Managers,ou=Groups,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 cn=QA Managers,ou=Groups,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 cn=PD Managers,ou=Groups,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:34 No more results INFO  lib389:psearch_test.py:30 cn=group1,ou=Groups,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:34 No more results
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[CRYPT] 0.30
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[SHA] 0.27
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[SSHA] 0.02
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[SHA256] 0.03
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[SSHA256] 0.02
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[SHA384] 0.02
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[SSHA384] 0.03
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[SHA512] 0.03
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[SSHA512] 0.03
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[MD5] 0.03
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[PBKDF2_SHA256] 0.07
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_clear_scheme 0.02
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_two_scheme 3.69
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_pbkdf2_sha256 4.24
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_ssha512 5.44
No log output captured.
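The storage_test runs above verify each supported passwordStorageScheme value in turn. A hedged sketch of the underlying check using lib389; the instance object is assumed to come from a standard lib389 topology fixture, and the uid/uidNumber values are arbitrary:

    from lib389._constants import DEFAULT_SUFFIX
    from lib389.idm.user import UserAccounts

    def check_scheme(inst, scheme):
        # Point the global password policy at the scheme under test.
        inst.config.set('passwordStorageScheme', scheme)

        users = UserAccounts(inst, DEFAULT_SUFFIX)
        user = users.create(properties={
            'uid': 'scheme_user', 'cn': 'scheme_user', 'sn': 'user',
            'uidNumber': '1001', 'gidNumber': '1001',
            'homeDirectory': '/home/scheme_user',
        })
        user.set('userPassword', 'Secret123')

        # The stored value is tagged with the scheme, e.g. {SSHA512}...
        stored = user.get_attr_val_utf8('userPassword')
        assert stored.lower().startswith('{%s}' % scheme.lower())
        user.delete()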
Passed suites/replication/regression_test.py::test_special_symbol_replica_agreement 1.03
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38902, 'ldap-secureport': 63602, 'server-id': 'standalone2', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:backend.py:80 List backend with suffix=dc=example,dc=com INFO  lib389:__init__.py:1772 Found entry dn: cn=replication manager,cn=config cn: bind dn pseudo user cn: replication manager objectClass: top objectClass: person sn: bind dn pseudo user userPassword: {PBKDF2_SHA256}AAAIAK03AlrnqCQah1YbQZS5hpP2TCEGgT6D+ArcroXtRnhpT+7JKoGa4kd1znY+63iHQXn2l36Dlx5Fmgn2xFahglRlPInysHKTu0i72fdEvgPq4mWznPGFXDmNtjddP3+jEwjEnFHV++SeL659Yw8Mqtsh4rTbd6IPuopzzzk0a/vEkC/Sb8102VdVslR83fswi0/eF/juG9uh0kUpuax+VPrj2Qf8zO906xKVkSYn27rEfnw4NmH0ljotLMP8WgCEmBw8yelrhvMSjAtlpRrfkpYbPc4OwFJqEZE5mUjMhmzm9A6zCBkS9e2rlvawDWWxNrJR018YVgT43LypTpNuj5gTySSF3a7I8YArYMWpoPdAVQx4ZEiunl7UjoNryYgid5OIOHvcY3TX5dYUbm8A9ArZ4gvZ57Hp+1a1zvZWtiTe INFO  lib389:agreement.py:1169 Starting total init cn=-3meTo_ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:38902,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsDS5ReplicaType--1-4-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.97
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsDS5Flags--1-2-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.32
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsDS5ReplicaId-0-65536-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.32
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsds5ReplicaPurgeDelay--2-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.06
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsDS5ReplicaBindDnGroupCheckInterval--2-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.06
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsds5ReplicaTombstonePurgeInterval--2-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.05
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsds5ReplicaProtocolTimeout--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.05
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsds5ReplicaReleaseTimeout--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.05
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsds5ReplicaBackoffMin-0-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-3] 0.05
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsds5ReplicaBackoffMax-0-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.05
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_modify[nsDS5Flags--1-2-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.04
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_modify[nsds5ReplicaPurgeDelay--2-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.05
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_modify[nsDS5ReplicaBindDnGroupCheckInterval--2-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.04
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_modify[nsds5ReplicaTombstonePurgeInterval--2-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.30
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_modify[nsds5ReplicaProtocolTimeout--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.04
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_modify[nsds5ReplicaReleaseTimeout--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.05
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_modify[nsds5ReplicaBackoffMin-0-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-3] 0.04
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_modify[nsds5ReplicaBackoffMax-0-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.04
No log output captured.
Passed suites/replication/replica_config_test.py::test_same_attr_yields_same_return_code 0.14
No log output captured.
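Each replica_config_test parameter above bundles an attribute name with its lower bound, upper bound, an overflow value, and a junk string, all of which the server must reject on add and on modify. A rough sketch of that reject-and-verify pattern with python-ldap (the replica DN follows the mapping-tree layout seen earlier in this report; connection details are placeholders):

    import ldap

    REPLICA_DN = ("cn=replica,cn=dc\\3Dexample\\2Cdc\\3Dcom,"
                  "cn=mapping tree,cn=config")

    conn = ldap.initialize("ldap://localhost:38901")  # placeholder URI
    conn.simple_bind_s("cn=Directory Manager", "password")

    # Overflow and non-numeric values should both be rejected.
    for bad in (b"9" * 67, b"invalid"):
        try:
            conn.modify_s(REPLICA_DN, [
                (ldap.MOD_REPLACE, "nsds5ReplicaProtocolTimeout", [bad]),
            ])
            raise AssertionError("server accepted %r" % bad)
        except ldap.LDAPError:
            pass  # rejected as expected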
Passed suites/replication/single_master_test.py::test_lastupdate_attr_before_init 0.62
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38902, 'ldap-secureport': 63602, 'server-id': 'standalone2', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2382 SUCCESS: Agreement from ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:38901 to ldap://ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:38902 was created
Passed suites/replication/tombstone_fixup_test.py::test_precise_tombstone_purging 24.55
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:148 Creating replication topology.
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:567 Export task export_10202020_205912 for file /var/lib/dirsrv/slapd-master1/ldif/export.ldif completed successfully INFO  lib389.utils:tombstone_fixup_test.py:77 Import replication LDIF file... INFO  lib389:tasks.py:498 Import task import_10202020_205915 for file /var/lib/dirsrv/slapd-master1/ldif/export.ldif completed successfully INFO  lib389:tasks.py:937 tombstone fixup task fixupTombstone_10202020_205919 for backend userRoot completed successfully INFO  lib389:tasks.py:937 tombstone fixup task fixupTombstone_10202020_205921 for backend userRoot completed successfully INFO  lib389.utils:tombstone_fixup_test.py:116 Wait for tombstone purge interval to pass... INFO  lib389.utils:tombstone_fixup_test.py:123 Wait for tombstone purge interval to pass again...
Passed suites/replication/tombstone_test.py::test_purge_success 0.13
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:148 Creating replication topology.
-------------------------------Captured log call--------------------------------
INFO  Tombstone:tombstone.py:165 Reviving nsuniqueid=b4f54006-133811eb-9348c73e-4a0f5526,uid=testuser,ou=People,dc=example,dc=com -> uid=testuser,ou=people,dc=example,dc=com
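test_purge_success revives a tombstone entry (nsuniqueid=...,uid=testuser,...). Tombstones are only returned when a search asks for them explicitly; a minimal python-ldap sketch of locating one before reviving or purging it (placeholder connection details):

    import ldap

    conn = ldap.initialize("ldap://localhost:39001")  # placeholder URI
    conn.simple_bind_s("cn=Directory Manager", "password")

    # nsTombstone entries are hidden unless the filter names them.
    results = conn.search_s(
        "dc=example,dc=com",
        ldap.SCOPE_SUBTREE,
        "(&(objectClass=nsTombstone)(uid=testuser))",
        ["nsuniqueid", "uid"],
    )
    for dn, attrs in results:
        print("tombstone:", dn, attrs)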
Passed suites/resource_limits/fdlimits_test.py::test_fd_limits 0.03
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.resource_limits.fdlimits_test:fdlimits_test.py:69 Test PASSED
Passed suites/rewriters/adfilter_test.py::test_adfilter_objectCategory 6.32
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/rewriters/basic_test.py::test_rewriters_container 0.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/rewriters/basic_test.py::test_foo_filter_rewriter 4.35
No log output captured.
Passed suites/roles/basic_test.py::test_filterrole 0.74
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/roles/basic_test.py::test_managedrole 0.17
No log output captured.
Passed suites/roles/basic_test.py::test_nestedrole 0.51
No log output captured.
Passed suites/sasl/allowed_mechs_test.py::test_basic_feature 37.06
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stdout call------------------------------
['EXTERNAL', 'GSS-SPNEGO', 'GSSAPI', 'DIGEST-MD5', 'CRAM-MD5', 'LOGIN', 'PLAIN', 'ANONYMOUS']
-------------------------------Captured log call--------------------------------
INFO  lib389:allowed_mechs_test.py:75 Test we have some of the default mechanisms INFO  lib389:allowed_mechs_test.py:83 Edit mechanisms to allow just PLAIN INFO  lib389:allowed_mechs_test.py:91 Restart server and make sure we still have correct allowed mechs INFO  lib389:allowed_mechs_test.py:100 Edit mechanisms to allow just PLAIN and EXTERNAL INFO  lib389:allowed_mechs_test.py:108 Edit mechanisms to allow just PLAIN and GSSAPI INFO  lib389:allowed_mechs_test.py:126 Edit mechanisms to allow just PLAIN, GSSAPI, and ANONYMOUS INFO  lib389:allowed_mechs_test.py:146 Edit mechanisms to allow just PLAIN and ANONYMOUS INFO  lib389:allowed_mechs_test.py:165 Reset allowed mechanisms INFO  lib389:allowed_mechs_test.py:169 Check that we have the original set of mechanisms INFO  lib389:allowed_mechs_test.py:174 Check that we have the original set of mechanisms after a restart
Passed suites/sasl/allowed_mechs_test.py::test_config_set_few_mechs 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:allowed_mechs_test.py:198 Set nsslapd-allowed-sasl-mechanisms to 'PLAIN GSSAPI' INFO  lib389:allowed_mechs_test.py:201 Verify nsslapd-allowed-sasl-mechanisms has the values
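Both SASL tests above drive nsslapd-allowed-sasl-mechanisms in cn=config. A minimal lib389-style sketch of the same edit-and-verify cycle; the inst object is assumed to be an already-open DirSrv connection:

    def restrict_sasl_mechs(inst):
        # Allow only PLAIN and GSSAPI, as in test_config_set_few_mechs.
        inst.config.set('nsslapd-allowed-sasl-mechanisms', 'PLAIN GSSAPI')

        allowed = inst.config.get_attr_val_utf8('nsslapd-allowed-sasl-mechanisms')
        assert 'PLAIN' in allowed and 'GSSAPI' in allowed

        # Removing the attribute restores the full default mechanism list.
        inst.config.remove_all('nsslapd-allowed-sasl-mechanisms')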
Passed suites/sasl/plain_test.py::test_basic_feature 11.40
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/schema/eduperson_test.py::test_account_locking 0.57
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.schema.eduperson_test:eduperson_test.py:88 Test PASSED
Passed suites/schema/schema_reload_test.py::test_valid_schema 2.14
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:43 Test schema-reload task with valid schema INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:46 Create valid schema file (99user.ldif)... INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:62 Run the schema-reload task... INFO  lib389:tasks.py:1169 Schema Reload task (task-10202020_210302) completed successfully INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:68 The schema reload task worked as expected INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:71 Check cn=schema to verify the valid schema was added
Passed suites/schema/schema_reload_test.py::test_invalid_schema 2.13
-------------------------------Captured log call--------------------------------
INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:95 Test schema-reload task with invalid schema INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:98 Create valid schema file (98user.ldif)... INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:113 Create invalid schema file (99user.ldif)... INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:130 Run the schema-reload task, it should fail... ERROR  lib389:tasks.py:1166 Error: Schema Reload task (task-10202020_210305) exited with -2 INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:136 The schema reload task failed as expected: error -2 INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:140 Check cn=schema to verify the invalid schema was not added INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:147 The invalid schema is not present on the server
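The schema_reload tests drop an edited 99user.ldif into the instance schema directory and run the server's schema reload task, checking nsTaskExitCode for success or failure. A rough sketch of invoking that task by adding an entry under cn=tasks,cn=config with python-ldap (the task cn and schema path are placeholders):

    import ldap

    conn = ldap.initialize("ldap://localhost:38901")  # placeholder URI
    conn.simple_bind_s("cn=Directory Manager", "password")

    task_dn = "cn=reload-1,cn=schema reload task,cn=tasks,cn=config"
    conn.add_s(task_dn, [
        ("objectClass", [b"top", b"extensibleObject"]),
        ("cn", [b"reload-1"]),
        # Directory holding the edited 99user.ldif (placeholder path).
        ("schemadir", [b"/etc/dirsrv/slapd-standalone1/schema"]),
    ])

    # Poll the task entry; nsTaskExitCode 0 means the reload succeeded.
    result = conn.search_s(task_dn, ldap.SCOPE_BASE,
                           attrlist=["nsTaskExitCode"])
    print(result)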
Passed suites/schema/schema_test.py::test_schema_comparewithfiles 0.11
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.schema.schema_test:schema_test.py:125 Running test_schema_comparewithfiles... WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /etc/dirsrv/slapd-standalone1/schema/99user.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/10mep-plugin.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60autofs.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/10automember-plugin.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/25java-object.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/00core.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/30ns-common.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60pam-plugin.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60posix-winsync-plugin.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/28pilot.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/10rfc2307.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/02common.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/50ns-directory.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/10dna-plugin.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/50ns-admin.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/20subscriber.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60trust.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60mozilla.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60pureftpd.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/05rfc4524.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60nss-ldap.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/50ns-certificate.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60eduperson.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse 
/usr/share/dirsrv/schema/05rfc2927.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60acctpolicy.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/50ns-web.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60sudo.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60rfc3712.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/50ns-mail.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/01core389.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/05rfc4523.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/50ns-value.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60sabayon.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/06inetorgperson.ldif as a schema file - skipping WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60rfc2739.ldif as a schema file - skipping INFO  tests.suites.schema.schema_test:schema_test.py:166 test_schema_comparewithfiles: PASSED
Passed suites/setup_ds/dscreate_test.py::test_setup_ds_minimal_dry 0.15
-------------------------------Captured log setup-------------------------------
DEBUG  lib389:dscreate_test.py:42 Instance allocated DEBUG  lib389:__init__.py:554 Allocate <class 'lib389.DirSrv'> with None DEBUG  lib389:__init__.py:577 Allocate <class 'lib389.DirSrv'> with ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:54321 DEBUG  lib389:__init__.py:602 Allocate <class 'lib389.DirSrv'> with ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:54321 DEBUG  lib389:__init__.py:742 list instance not found in /etc/dirsrv/slapd-standalone/dse.ldif: standalone
-------------------------------Captured log call--------------------------------
DEBUG  lib389:__init__.py:742 list instance not found in /etc/dirsrv/slapd-standalone/dse.ldif: standalone INFO  LogCapture.SetupDs:setup.py:670 NOOP: Dry run requested DEBUG  lib389:__init__.py:742 list instance not found in /etc/dirsrv/slapd-standalone/dse.ldif: standalone
Passed suites/setup_ds/dscreate_test.py::test_setup_ds_inf_minimal 0.00
-----------------------------Captured stderr setup------------------------------
The following errors occurred during removal: Error: could not find directory server configuration directory 'slapd-standalone'. Error: No such file or directory Error: could not remove directory server standalone
-------------------------------Captured log setup-------------------------------
DEBUG  lib389:dscreate_test.py:42 Instance allocated DEBUG  lib389:__init__.py:554 Allocate <class 'lib389.DirSrv'> with None DEBUG  lib389:__init__.py:577 Allocate <class 'lib389.DirSrv'> with ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:54321 DEBUG  lib389:__init__.py:602 Allocate <class 'lib389.DirSrv'> with ci-vm-10-0-137-110.hosted.upshift.rdu2.redhat.com:54321 DEBUG  lib389:__init__.py:932 running: /usr/sbin/remove-ds.pl -i slapd-standalone
Passed suites/setup_ds/remove_test.py::test_basic[True] 3.47
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/setup_ds/remove_test.py::test_basic[False] 4.14
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/setup_ds/setup_ds_test.py::test_slapd_InstScriptsEnabled[true] 6.83
------------------------------Captured stdout call------------------------------
Instance slapd-standalone1 removed.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.setup_ds.setup_ds_test:setup_ds_test.py:63 set SER_INST_SCRIPTS_ENABLED to true INFO  tests.suites.setup_ds.setup_ds_test:setup_ds_test.py:20 create_instance - installs the instance and sets the value of InstScriptsEnabled to true or false. INFO  tests.suites.setup_ds.setup_ds_test:setup_ds_test.py:22 Set up the instance and set the config_attr INFO  tests.suites.setup_ds.setup_ds_test:setup_ds_test.py:68 checking the presence of the instance script directory when SER_INST_SCRIPTS_ENABLED is set to true
Passed suites/setup_ds/setup_ds_test.py::test_slapd_InstScriptsEnabled[false] 7.01
------------------------------Captured stdout call------------------------------
Instance slapd-standalone1 removed.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.setup_ds.setup_ds_test:setup_ds_test.py:63 set SER_INST_SCRIPTS_ENABLED to false INFO  tests.suites.setup_ds.setup_ds_test:setup_ds_test.py:20 create_instance - installs the instance and sets the value of InstScriptsEnabled to true or false. INFO  tests.suites.setup_ds.setup_ds_test:setup_ds_test.py:22 Set up the instance and set the config_attr INFO  tests.suites.setup_ds.setup_ds_test:setup_ds_test.py:72 checking that the instance script directory is not present when SER_INST_SCRIPTS_ENABLED is set to false
Passed suites/syncrepl_plugin/basic_test.py::test_syncrepl_basic 4.90
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/syntax/acceptance_test.py::test_valid 5.50
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389.topologies.tests.suites.syntax.acceptance_test:acceptance_test.py:54 Clean the error log INFO  lib389.topologies.tests.suites.syntax.acceptance_test:acceptance_test.py:58 Attempting to add task entry... INFO  lib389.topologies.tests.suites.syntax.acceptance_test:acceptance_test.py:65 Found 0 invalid entries - Success
Passed suites/syntax/acceptance_test.py::test_invalid_uidnumber 6.28
-------------------------------Captured log call--------------------------------
INFO  lib389.topologies.tests.suites.syntax.acceptance_test:acceptance_test.py:91 Clean the error log INFO  lib389.topologies.tests.suites.syntax.acceptance_test:acceptance_test.py:98 Attempting to add task entry... INFO  lib389.topologies.tests.suites.syntax.acceptance_test:acceptance_test.py:105 Found an invalid entry with wrong uidNumber - Success
Passed suites/syntax/mr_test.py::test_sss_mr 6.09
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.syntax.mr_test:mr_test.py:42 Creating LDIF... INFO  tests.suites.syntax.mr_test:mr_test.py:47 Importing LDIF... INFO  tests.suites.syntax.mr_test:mr_test.py:52 Search using server side sorting using undefined mr in the attr... INFO  tests.suites.syntax.mr_test:mr_test.py:62 Test PASSED
Passed suites/tls/cipher_test.py::test_long_cipher_list 19.29
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/tls/ssl_version_test.py::test_ssl_version_range 65.41
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.tls.ssl_version_test:ssl_version_test.py:36 default min: TLS1.2 max: TLS1.3
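test_ssl_version_range reports the default range as TLS1.2 through TLS1.3. Those bounds live on cn=encryption,cn=config and can be adjusted like any other config attribute; a hedged sketch with python-ldap (placeholder connection details, and a server restart is typically needed before the new range applies):

    import ldap

    ENCRYPTION_DN = "cn=encryption,cn=config"

    conn = ldap.initialize("ldap://localhost:38901")  # placeholder URI
    conn.simple_bind_s("cn=Directory Manager", "password")

    conn.modify_s(ENCRYPTION_DN, [
        (ldap.MOD_REPLACE, "sslVersionMin", [b"TLS1.2"]),
        (ldap.MOD_REPLACE, "sslVersionMax", [b"TLS1.3"]),
    ])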
Passed suites/tls/tls_cert_namespace_test.py::test_pem_cert_in_private_namespace 11.33
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:54 Enable TLS INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:57 Checking PrivateTmp value INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:59 Command used : systemctl show -p PrivateTmp dirsrv@standalone1.service INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:63 Check files in private /tmp INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:68 Check that Self-Signed-CA.pem is present in private /tmp INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:68 Check that Server-Cert-Key.pem is present in private /tmp INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:68 Check that Server-Cert.pem is present in private /tmp INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:71 Check instance cert directory INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:75 Check that Self-Signed-CA.pem is not present in /etc/dirsrv/slapd-standalone1/ directory INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:75 Check that Server-Cert-Key.pem is not present in /etc/dirsrv/slapd-standalone1/ directory INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:75 Check that Server-Cert.pem is not present in /etc/dirsrv/slapd-standalone1/ directory
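The namespace test verifies PrivateTmp with the exact command shown in the log. A trivial sketch of the same check from Python (the unit name is instance-specific):

    import subprocess

    # Same command as the captured log above.
    out = subprocess.check_output(
        ["systemctl", "show", "-p", "PrivateTmp",
         "dirsrv@standalone1.service"], text=True)
    assert out.strip() == "PrivateTmp=yes"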
Passed suites/tls/tls_cert_namespace_test.py::test_cert_category_authority 8.82
-------------------------------Captured log call--------------------------------
INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:106 Enable TLS INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:109 Get certificate path INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:115 Check that Self-Signed-CA.pem is present INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:119 Trust the certificate INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:122 Search if our certificate has category: authority
Passed suites/tls/tls_check_crl_test.py::test_tls_check_crl 11.25
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/tls/tls_ldaps_only_test.py::test_tls_ldaps_only 22.91
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47560_test.py::test_ticket47560 17.26
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:877 fixupMemberOf task fixupmemberof_10202020_211019 for basedn dc=example,dc=com completed successfully
Passed tickets/ticket47640_test.py::test_ticket47640 0.09
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket47640_test:ticket47640_test.py:65 Add operation correctly rejected. INFO  tests.tickets.ticket47640_test:ticket47640_test.py:75 Test complete
Passed tickets/ticket47714_test.py::test_ticket47714_init 0.22
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47714_test.py:38 ############################################### INFO  lib389:ticket47714_test.py:39 ####### INFO  lib389:ticket47714_test.py:40 ####### Testing Ticket 47714 - [RFE] Update lastLoginTime also in Account Policy plugin if account lockout is based on passwordExpirationTime. INFO  lib389:ticket47714_test.py:41 ####### INFO  lib389:ticket47714_test.py:42 ############################################### INFO  lib389.utils:ticket47714_test.py:55 ######################### Adding Account Policy entry: cn=Account Inactivation Policy,dc=example,dc=com ###################### INFO  lib389.utils:ticket47714_test.py:60 ######################### Adding Test User entry: uid=ticket47714user,dc=example,dc=com ######################
Passed tickets/ticket47714_test.py::test_ticket47714_run_0 10.86
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47714_test.py:38 ############################################### INFO  lib389:ticket47714_test.py:39 ####### INFO  lib389:ticket47714_test.py:40 ####### Account Policy - No new attr alwaysRecordLoginAttr in config INFO  lib389:ticket47714_test.py:41 ####### INFO  lib389:ticket47714_test.py:42 ############################################### INFO  lib389.utils:ticket47714_test.py:96 ######################### Bind as uid=ticket47714user,dc=example,dc=com ###################### INFO  lib389.utils:ticket47714_test.py:109 ######################### Bind as uid=ticket47714user,dc=example,dc=com again ###################### INFO  lib389.utils:ticket47714_test.py:122 First lastLoginTime: b'20201021011334Z', Second lastLoginTime: b'20201021011336Z' INFO  lib389.utils:ticket47714_test.py:133 ######################### cn=config,cn=Account Policy Plugin,cn=plugins,cn=config ###################### INFO  lib389.utils:ticket47714_test.py:134 accountInactivityLimit: b'1' INFO  lib389.utils:ticket47714_test.py:135 ######################### cn=config,cn=Account Policy Plugin,cn=plugins,cn=config DONE ###################### INFO  lib389.utils:ticket47714_test.py:137 ######################### Bind as uid=ticket47714user,dc=example,dc=com again to fail ###################### INFO  lib389.utils:ticket47714_test.py:141 CONSTRAINT VIOLATION Constraint violation INFO  lib389.utils:ticket47714_test.py:142 uid=ticket47714user,dc=example,dc=com was successfully inactivated.
Passed tickets/ticket47714_test.py::test_ticket47714_run_1 5.91
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47714_test.py:38 ############################################### INFO  lib389:ticket47714_test.py:39 ####### INFO  lib389:ticket47714_test.py:40 ####### Account Policy - With new attr alwaysRecordLoginAttr in config INFO  lib389:ticket47714_test.py:41 ####### INFO  lib389:ticket47714_test.py:42 ############################################### INFO  lib389.utils:ticket47714_test.py:179 ######################### Bind as uid=ticket47714user,dc=example,dc=com ###################### INFO  lib389.utils:ticket47714_test.py:191 ######################### Bind as uid=ticket47714user,dc=example,dc=com again ###################### INFO  lib389.utils:ticket47714_test.py:203 First lastLoginTime: b'20201021011344Z', Second lastLoginTime: b'20201021011345Z' INFO  lib389:ticket47714_test.py:206 ticket47714 was successfully verified.
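The ticket47714 runs toggle the Account Policy plugin's alwaysRecordLoginAttr and drive accountInactivityLimit down to one second so an account can be inactivated between binds. A hedged sketch of the shared config entry those runs modify; the DN and the accountInactivityLimit value appear in the captured log, while the connection details are placeholders:

    import ldap

    AP_CONFIG_DN = "cn=config,cn=Account Policy Plugin,cn=plugins,cn=config"

    conn = ldap.initialize("ldap://localhost:38901")  # placeholder URI
    conn.simple_bind_s("cn=Directory Manager", "password")

    conn.modify_s(AP_CONFIG_DN, [
        # Lock out accounts idle longer than one second (test value).
        (ldap.MOD_REPLACE, "accountInactivityLimit", [b"1"]),
        # Record lastLoginTime even when lockout is based on
        # passwordExpirationTime.
        (ldap.MOD_REPLACE, "alwaysRecordLoginAttr", [b"lastLoginTime"]),
    ])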
Passed tickets/ticket47808_test.py::test_ticket47808_run 3.66
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47808_test.py:36 Bind as cn=Directory Manager INFO  lib389:ticket47808_test.py:39 ######################### SETUP ATTR UNIQ PLUGIN ###################### INFO  lib389:ticket47808_test.py:46 ######################### ADD USER 1 ###################### INFO  lib389:ticket47808_test.py:55 Try to add Add dn: cn=test_entry 1, dc=example,dc=com cn: test_entry 1 objectclass: top objectclass: person sn: test_entry : dn: cn=test_entry 1, dc=example,dc=com cn: test_entry 1 objectclass: top objectclass: person sn: test_entry INFO  lib389:ticket47808_test.py:58 ######################### Restart Server ###################### INFO  lib389:ticket47808_test.py:62 ######################### ADD USER 2 ###################### INFO  lib389:ticket47808_test.py:71 Try to add Add dn: cn=test_entry 2, dc=example,dc=com cn: test_entry 2 objectclass: top objectclass: person sn: test_entry : dn: cn=test_entry 2, dc=example,dc=com cn: test_entry 2 objectclass: top objectclass: person sn: test_entry WARNING  lib389:ticket47808_test.py:75 Adding cn=test_entry 2, dc=example,dc=com failed INFO  lib389:ticket47808_test.py:78 ######################### IS SERVER UP? ###################### INFO  lib389:ticket47808_test.py:81 Yes, it's up. INFO  lib389:ticket47808_test.py:83 ######################### CHECK USER 2 NOT ADDED ###################### INFO  lib389:ticket47808_test.py:84 Try to search cn=test_entry 2, dc=example,dc=com INFO  lib389:ticket47808_test.py:88 Found none INFO  lib389:ticket47808_test.py:90 ######################### DELETE USER 1 ###################### INFO  lib389:ticket47808_test.py:92 Try to delete cn=test_entry 1, dc=example,dc=com
Passed tickets/ticket47823_test.py::test_ticket47823_init 12.26
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47823_test.py::test_ticket47823_one_container_add 18.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With former config (args), check attribute uniqueness with 'cn' (ADD) INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:140 Uniqueness not enforced: create the entries INFO  lib389:ticket47823_test.py:155 Uniqueness enforced: checks second entry is rejected INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With new config (args), check attribute uniqueness with 'cn' (ADD) INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:140 Uniqueness not enforced: create the entries INFO  lib389:ticket47823_test.py:155 Uniqueness enforced: checks second entry is rejected
Passed tickets/ticket47823_test.py::test_ticket47823_one_container_mod 9.30
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With former config (args), check attribute uniqueness with 'cn' (MOD) INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:193 Uniqueness enforced: checks MOD ADD entry is rejected INFO  lib389:ticket47823_test.py:210 Uniqueness enforced: checks MOD REPLACE entry is rejected INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With new config (args), check attribute uniqueness with 'cn' (MOD) INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:193 Uniqueness enforced: checks MOD ADD entry is rejected INFO  lib389:ticket47823_test.py:210 Uniqueness enforced: checks MOD REPLACE entry is rejected
Passed tickets/ticket47823_test.py::test_ticket47823_one_container_modrdn 9.36
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With former config (args), check attribute uniqueness with 'cn' (MODRDN) INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:237 Uniqueness enforced: checks MODRDN entry is rejected INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With new config (args), check attribute uniqueness with 'cn' (MODRDN) INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:237 Uniqueness enforced: checks MODRDN entry is rejected
Passed tickets/ticket47823_test.py::test_ticket47823_multi_containers_add 9.00
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With former config (args), check attribute uniqueness with 'cn' (ADD) INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With new config (args), check attribute uniqueness with 'cn' (ADD) INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_multi_containers_mod 9.50
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With former config (args), check attribute uniqueness with 'cn' (MOD) on separated container INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:525 Uniqueness not enforced: if same 'cn' modified (add/replace) on separated containers INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With new config (args), check attribute uniqueness with 'cn' (MOD) on separated container INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:531 Uniqueness not enforced: if same 'cn' modified (add/replace) on separated containers
Passed tickets/ticket47823_test.py::test_ticket47823_multi_containers_modrdn 8.95
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With former config (args), check attribute uniqueness with 'cn' (MODRDN) on separated containers INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:545 Uniqueness not enforced: checks MODRDN entry is accepted on separated containers INFO  lib389:ticket47823_test.py:548 Uniqueness not enforced: checks MODRDN entry is accepted on separated containers
Passed tickets/ticket47823_test.py::test_ticket47823_across_multi_containers_add 5.03
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With new config (args), check attribute uniqueness with 'cn' (ADD) across several containers INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_across_multi_containers_mod 4.09
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With new config (args), check attribute uniqueness with 'cn' (MOD) across several containers INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_across_multi_containers_modrdn 4.72
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With new config (args), check attribute uniqueness with 'cn' (MODRDN) across several containers INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_invalid_config_1 11.85
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### Invalid config (old): arg0 is missing INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_invalid_config_2 9.44
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### Invalid config (old): arg1 is missing INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_invalid_config_3 10.22
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### Invalid config (old): arg0 is missing but new config attrname exists INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_invalid_config_4 9.39
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### Invalid config (old): arg1 is missing but new config exist INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_invalid_config_5 9.04
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### Invalid config (new): uniqueness-attribute-name is missing INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_invalid_config_6 9.07
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### Invalid config (new): uniqueness-subtrees is missing INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_invalid_config_7 9.76
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### Invalid config (new): uniqueness-subtrees are invalid INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
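The ticket47823 cases above exercise the Attribute Uniqueness plugin in both its legacy argument style (nsslapd-pluginarg0/nsslapd-pluginarg1) and the newer named attributes (uniqueness-attribute-name, uniqueness-subtrees), and verify that a config missing either half is rejected. A minimal sketch of the new-style configuration with python-ldap; the bind credentials and the two subtrees are illustrative, and cn=attribute uniqueness,cn=plugins,cn=config is assumed to be the plugin entry:

import ldap

conn = ldap.initialize("ldap://localhost:389")
conn.simple_bind_s("cn=Directory Manager", "password")  # hypothetical credentials

plugin_dn = "cn=attribute uniqueness,cn=plugins,cn=config"
conn.modify_s(plugin_dn, [
    # Both attributes must be present; dropping either one reproduces
    # the invalid_config_6/7 failures logged above.
    (ldap.MOD_REPLACE, "uniqueness-attribute-name", [b"cn"]),
    (ldap.MOD_REPLACE, "uniqueness-subtrees",
     [b"ou=people,dc=example,dc=com", b"ou=groups,dc=example,dc=com"]),
    (ldap.MOD_REPLACE, "nsslapd-pluginEnabled", [b"on"]),
])
# Plugin config changes normally require a server restart to take effect.

With two subtrees listed, an ADD, MOD, or MODRDN that would duplicate a cn value in either container fails with a constraint violation, which is what the across_multi_containers cases check.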
Passed tickets/ticket47828_test.py::test_ticket47828_init 4.56
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47828_test.py::test_ticket47828_run_0 0.26
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### NO exclude scope: Add an active entry and check its ALLOCATED_ATTR is set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_1 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### NO exclude scope: Add an active entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_2 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### NO exclude scope: Add a staged entry and check its ALLOCATED_ATTR is set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_3 0.40
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### NO exclude scope: Add a staged entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_4 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Exclude the provisioning container INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_5 0.03
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning excluded scope: Add an active entry and check its ALLOCATED_ATTR is set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_6 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning excluded scope: Add an active entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_7 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning excluded scope: Add a staged entry and check its ALLOCATED_ATTR is not set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_8 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning excluded scope: Add a staged entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_9 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning excluded scope: Add a dummy entry and check its ALLOCATED_ATTR is set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_10 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning excluded scope: Add a dummy entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_11 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Exclude (in addition) the dummy container INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_12 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning/Dummy excluded scope: Add an active entry and check its ALLOCATED_ATTR is set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_13 0.26
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning/Dummy excluded scope: Add an active entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_14 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning/Dummy excluded scope: Add a staged entry and check its ALLOCATED_ATTR is not set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_15 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning/Dummy excluded scope: Add a staged entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_16 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning/Dummy excluded scope: Add a dummy entry and check its ALLOCATED_ATTR is not set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_17 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning/Dummy excluded scope: Add a dummy entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_18 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Exclude PROVISIONING and a wrong container INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_19 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning+wrong container excluded scope: Add an active entry and check its ALLOCATED_ATTR is set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_20 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning+wrong container excluded scope: Add an active entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_21 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning+wrong container excluded scope: Add a staged entry and check its ALLOCATED_ATTR is not set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_22 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning+wrong container excluded scope: Add a staged entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_23 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning+wrong container excluded scope: Add a dummy entry and check its ALLOCATED_ATTR is set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_24 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning+wrong container excluded scope: Add a dummy entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_25 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Exclude a wrong container INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_26 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Wrong container excluded scope: Add an active entry and check its ALLOCATED_ATTR is set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_27 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Wrong container excluded scope: Add an active entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_28 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Wrong container excluded scope: Add a staged entry and check its ALLOCATED_ATTR is not set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_29 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Wrong container excluded scope: Add a staged entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_30 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Wrong container excluded scope: Add a dummy entry and check its ALLOCATED_ATTR is set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_31 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Wrong container excluded scope: Add a dummy entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
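The ticket47828 sequence walks the DNA (Distributed Numeric Assignment) plugin through progressively different exclude-scope settings: active entries always get a value allocated into ALLOCATED_ATTR, while staged or dummy entries under an excluded container keep the magic value untouched. A hedged sketch of that kind of configuration with python-ldap; the shared config DN, magic value, and credentials below are illustrative, not taken from the test:

import ldap

conn = ldap.initialize("ldap://localhost:389")
conn.simple_bind_s("cn=Directory Manager", "password")  # hypothetical credentials

# Hypothetical DNA range entry under the DNA plugin.
dna_dn = "cn=uid numbers,cn=Distributed Numeric Assignment Plugin,cn=plugins,cn=config"
conn.modify_s(dna_dn, [
    # Entries created with this magic value get a real number allocated...
    (ldap.MOD_REPLACE, "dnaMagicRegen", [b"-1"]),
    # ...unless they live under an excluded subtree, as with the staged
    # entries under cn=provisioning in the runs above.
    (ldap.MOD_REPLACE, "dnaExcludeScope", [b"cn=provisioning,dc=example,dc=com"]),
])

The "wrong container" runs show that listing a nonexistent subtree in the exclude scope is harmless: entries outside any excluded scope still get their value allocated.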
Passed tickets/ticket47829_test.py::test_ticket47829_init 4.52
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47829_test.py::test_ticket47829_mod_active_user_1 2.05
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### MOD: add an active user to an active group INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=active guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_active_user_2 2.28
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### MOD: add an Active user to a Stage group INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=stage group,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=stage group,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com: member ->b'cn=active guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=stage group,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_active_user_3 2.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### MOD: add an Active user to an out of scope group INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=out group,cn=out,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=out group,cn=out,dc=example,dc=com: member ->b'cn=active guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=out group,cn=out,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_stage_user_1 2.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### MOD: add a Stage user to an Active group INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_stage_user_2 2.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### MOD: add a Stage user to a Stage group INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=stage group,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=stage group,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=stage group,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_stage_user_3 2.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### MOD: add a Stage user to an out of scope group INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=out group,cn=out,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=out group,cn=out,dc=example,dc=com: member ->b'cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=out group,cn=out,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_out_user_1 2.03
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### MOD: add an out of scope user to an active group INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=out guy,cn=out,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=out guy,cn=out,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=out guy,cn=out,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_out_user_2 2.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### MOD: add an out of scope user to a Stage group INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=out guy,cn=out,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=stage group,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=stage group,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com: member ->b'cn=out guy,cn=out,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=out guy,cn=out,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=stage group,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_out_user_3 2.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### MOD: add an out of scope user to an out of scope group INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=out guy,cn=out,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=out group,cn=out,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=out group,cn=out,dc=example,dc=com: member ->b'cn=out guy,cn=out,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=out guy,cn=out,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=out group,cn=out,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_active_user_modrdn_active_user_1 2.05
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add an Active user to an Active group. Then move the Active user to Active INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=active guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=xactive guy ###################### INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=xactive guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=xactive guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=active guy ###################### INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=active guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_active_user_modrdn_stage_user_1 1.06
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add an Active user to an Active group. Then move the Active user to Stage INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=active guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=active guy ###################### INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=active guy ######################
Passed tickets/ticket47829_test.py::test_ticket47829_mod_active_user_modrdn_out_user_1 1.03
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add an Active user to an Active group. Then move the Active user out of scope INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=active guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=active guy ###################### INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=active guy ######################
Passed tickets/ticket47829_test.py::test_ticket47829_mod_modrdn_1 1.05
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add a Stage user to an Active group. Then move the Stage user to Active INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=stage guy ###################### INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=stage guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=stage guy ######################
Passed tickets/ticket47829_test.py::test_ticket47829_mod_stage_user_modrdn_active_user_1 1.03
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add a Stage user to an Active group. Then move the Stage user to Active INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=stage guy ###################### INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=stage guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=stage guy ######################
Passed tickets/ticket47829_test.py::test_ticket47829_mod_stage_user_modrdn_stage_user_1 0.00
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add a Stage user to an Active group. Then move the Stage user to Stage INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### Return because it requires a fix for 47833 INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ###############################################
Passed tickets/ticket47829_test.py::test_ticket47829_indirect_active_group_1 2.05
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add an Active group (G1) to an active group (G0). Then add an active user to G1 INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=indirect active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_indirect_active_group_2 1.07
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add an Active group (G1) to an active group (G0). Then add an active user to G1. Then move the active user to stage INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=indirect active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=active guy ###################### INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=active guy ######################
Passed tickets/ticket47829_test.py::test_ticket47829_indirect_active_group_3 1.30
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add an Active group (G1) to an active group (G0). Then add an active user to G1. Then move the active user out of the scope INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=indirect active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=active guy ###################### INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=active guy ######################
Passed tickets/ticket47829_test.py::test_ticket47829_indirect_active_group_4 1.05
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add an Active group (G1) to an active group (G0). Then add a stage user to G1. Then move the user to active. Then move it back INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=indirect active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=stage guy ###################### INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=indirect active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=stage guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=stage guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=indirect active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=stage guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=stage guy ###################### INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=indirect active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com'
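Taken together, the ticket47829 cases pin down the memberOf plugin's scoping rules: the member attribute is always written on the group, but the memberOf back-link only appears on entries inside the plugin's scope, so staged (cn=provisioning) and out-of-scope users never receive it, and a MODRDN into or out of the active subtree adds or strips memberOf accordingly. A hedged sketch of that scoping, assuming the memberOfEntryScope and memberOfEntryScopeExcludeSubtree attributes on the default plugin entry and hypothetical credentials:

import ldap

conn = ldap.initialize("ldap://localhost:389")
conn.simple_bind_s("cn=Directory Manager", "password")  # hypothetical credentials

memberof_dn = "cn=MemberOf Plugin,cn=plugins,cn=config"
conn.modify_s(memberof_dn, [
    # Only entries under this subtree get a memberOf attribute.
    (ldap.MOD_REPLACE, "memberOfEntryScope", [b"cn=in,dc=example,dc=com"]),
    # Staged entries are carved out: a stage user added to an active group
    # shows up as member on the group but gets no memberOf back-link.
    (ldap.MOD_REPLACE, "memberOfEntryScopeExcludeSubtree",
     [b"cn=provisioning,cn=in,dc=example,dc=com"]),
])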
Passed tickets/ticket47833_test.py::test_ticket47829_init 4.50
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47833_test.py::test_ticket47829_mod_stage_user_modrdn_stage_user_1 1.03
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47833_test.py:58 ############################################### INFO  lib389:ticket47833_test.py:59 ####### INFO  lib389:ticket47833_test.py:60 ####### add a Stage user to an Active group. Then move the Stage user to Stage INFO  lib389:ticket47833_test.py:61 ####### INFO  lib389:ticket47833_test.py:62 ############################################### INFO  lib389:ticket47833_test.py:145 add entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47833_test.py:146 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47833_test.py:112 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com' INFO  lib389:ticket47833_test.py:128 ######################### MODRDN cn=xstage guy ###################### INFO  lib389:ticket47833_test.py:112 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com'
Passed tickets/ticket47900_test.py::test_ticket47900 1.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47900_test.py:52 Creating Password Administrator entry cn=passwd_admin,dc=example,dc=com... INFO  lib389:ticket47900_test.py:62 Configuring password policy... INFO  lib389:ticket47900_test.py:74 Add aci to allow password admin to add/update entries... INFO  lib389:ticket47900_test.py:87 Bind as the Password Administrator (before activating)... INFO  lib389:ticket47900_test.py:101 Attempt to add entries with invalid passwords, these adds should fail... INFO  lib389:ticket47900_test.py:105 Create a regular user entry cn=Joe Schmo,dc=example,dc=com with password (2_Short)... INFO  lib389:ticket47900_test.py:111 Add failed as expected: password (2_Short) result (Constraint violation) INFO  lib389:ticket47900_test.py:105 Create a regular user entry cn=Joe Schmo,dc=example,dc=com with password (No_Number)... INFO  lib389:ticket47900_test.py:111 Add failed as expected: password (No_Number) result (Constraint violation) INFO  lib389:ticket47900_test.py:105 Create a regular user entry cn=Joe Schmo,dc=example,dc=com with password (N0Special)... INFO  lib389:ticket47900_test.py:111 Add failed as expected: password (N0Special) result (Constraint violation) INFO  lib389:ticket47900_test.py:105 Create a regular user entry cn=Joe Schmo,dc=example,dc=com with password ({SSHA}bBy8UdtPZwu8uZna9QOYG3Pr41RpIRVDl8wddw==)... INFO  lib389:ticket47900_test.py:111 Add failed as expected: password ({SSHA}bBy8UdtPZwu8uZna9QOYG3Pr41RpIRVDl8wddw==) result (Constraint violation) INFO  lib389:ticket47900_test.py:123 Activate the Password Administrator... INFO  lib389:ticket47900_test.py:139 Create a regular user entry cn=Joe Schmo,dc=example,dc=com with password (2_Short)... INFO  lib389:ticket47900_test.py:142 Successfully added entry (cn=Joe Schmo,dc=example,dc=com) INFO  lib389:ticket47900_test.py:139 Create a regular user entry cn=Joe Schmo,dc=example,dc=com with password (No_Number)... INFO  lib389:ticket47900_test.py:142 Successfully added entry (cn=Joe Schmo,dc=example,dc=com) INFO  lib389:ticket47900_test.py:139 Create a regular user entry cn=Joe Schmo,dc=example,dc=com with password (N0Special)... INFO  lib389:ticket47900_test.py:142 Successfully added entry (cn=Joe Schmo,dc=example,dc=com) INFO  lib389:ticket47900_test.py:139 Create a regular user entry cn=Joe Schmo,dc=example,dc=com with password ({SSHA}bBy8UdtPZwu8uZna9QOYG3Pr41RpIRVDl8wddw==)... INFO  lib389:ticket47900_test.py:142 Successfully added entry (cn=Joe Schmo,dc=example,dc=com) INFO  lib389:ticket47900_test.py:155 Deactivate Password Administrator and try invalid password updates... INFO  lib389:ticket47900_test.py:177 Password update failed as expected: password (2_Short) result (Constraint violation) INFO  lib389:ticket47900_test.py:177 Password update failed as expected: password (No_Number) result (Constraint violation) INFO  lib389:ticket47900_test.py:177 Password update failed as expected: password (N0Special) result (Constraint violation) INFO  lib389:ticket47900_test.py:177 Password update failed as expected: password ({SSHA}bBy8UdtPZwu8uZna9QOYG3Pr41RpIRVDl8wddw==) result (Constraint violation) INFO  lib389:ticket47900_test.py:188 Activate Password Administrator and try updates again... INFO  lib389:ticket47900_test.py:205 Password update succeeded (2_Short) INFO  lib389:ticket47900_test.py:205 Password update succeeded (No_Number) INFO  lib389:ticket47900_test.py:205 Password update succeeded (N0Special) INFO  lib389:ticket47900_test.py:205 Password update succeeded ({SSHA}bBy8UdtPZwu8uZna9QOYG3Pr41RpIRVDl8wddw==)
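ticket47900 exercises the password administrator feature: password syntax rules (minimum length, required digit, required special character, no pre-hashed values) block both ADDs and password updates until the bound DN is registered as password administrator, at which point the same weak and pre-hashed passwords are accepted. A minimal sketch with python-ldap, assuming the passwordAdminDN setting in cn=config and hypothetical credentials:

import ldap

conn = ldap.initialize("ldap://localhost:389")
conn.simple_bind_s("cn=Directory Manager", "password")  # hypothetical credentials

# Activate the password administrator, as in the "Activate" steps above.
conn.modify_s("cn=config", [
    (ldap.MOD_REPLACE, "passwordAdminDN", [b"cn=passwd_admin,dc=example,dc=com"]),
])
# Deactivating is a delete of the same attribute, after which the weak
# passwords fail again with a constraint violation:
# conn.modify_s("cn=config", [(ldap.MOD_DELETE, "passwordAdminDN", None)])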
Passed tickets/ticket47910_test.py::test_ticket47910_logconv_start_end_positive 0.49
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.utils:ticket47910_test.py:36 Disable access log buffering INFO  lib389.utils:ticket47910_test.py:39 Do an ldapsearch operation INFO  lib389.utils:ticket47910_test.py:42 sleep for some time so that the access log file gets generated
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:ticket47910_test.py:79 Running test_ticket47910 - Execute logconv.pl -S -E with random values INFO  lib389.utils:ticket47910_test.py:81 taking current time with offset of 2 mins and formatting it to feed -S INFO  lib389.utils:ticket47910_test.py:85 taking current time with offset of 2 mins and formatting it to feed -E INFO  lib389.utils:ticket47910_test.py:89 Executing logconv.pl with -S and -E INFO  lib389.utils:ticket47910_test.py:61 Executing logconv.pl with -S current time and -E end time INFO  lib389.utils:ticket47910_test.py:63 /usr/bin/logconv.pl -S [20/Oct/2020:21:19:16] -E [20/Oct/2020:21:23:16] /var/log/dirsrv/slapd-standalone1/access INFO  lib389.utils:ticket47910_test.py:66 standard output: Access Log Analyzer 8.2 Command: logconv.pl /var/log/dirsrv/slapd-standalone1/access Processing 1 Access Log(s)... [001] /var/log/dirsrv/slapd-standalone1/access size (bytes): 1136 Total Log Lines Analysed: 9 ----------- Access Log Output ------------ Start of Logs: 20/Oct/2020:21:19:16 End of Logs: 20/Oct/2020:21:21:15.526465963 Processed Log Time: 0 Hours, 1 Minutes, 59.526465536 Seconds Restarts: 1 Peak Concurrent Connections: 1 Total Operations: 4 Total Results: 4 Overall Performance: 100.0% Total Connections: 1 (0.01/sec) (0.50/min) - LDAP Connections: 1 (0.01/sec) (0.50/min) - LDAPI Connections: 0 (0.00/sec) (0.00/min) - LDAPS Connections: 0 (0.00/sec) (0.00/min) - StartTLS Extended Ops: 0 (0.00/sec) (0.00/min) Searches: 2 (0.02/sec) (1.00/min) Modifications: 1 (0.01/sec) (0.50/min) Adds: 0 (0.00/sec) (0.00/min) Deletes: 0 (0.00/sec) (0.00/min) Mod RDNs: 0 (0.00/sec) (0.00/min) Compares: 0 (0.00/sec) (0.00/min) Binds: 1 (0.01/sec) (0.50/min) Average wtime (wait time): 0.000174168 Average optime (op time): 0.025997450 Average etime (elapsed time): 0.026167919 Proxied Auth Operations: 0 Persistent Searches: 0 Internal Operations: 0 Entry Operations: 0 Extended Operations: 0 Abandoned Requests: 0 Smart Referrals Received: 0 VLV Operations: 0 VLV Unindexed Searches: 0 VLV Unindexed Components: 0 SORT Operations: 0 Entire Search Base Queries: 1 Paged Searches: 0 Unindexed Searches: 0 Unindexed Components: 1 Invalid Attribute Filters: 0 FDs Taken: 1 FDs Returned: 0 Highest FD Taken: 64 Broken Pipes: 0 Connections Reset By Peer: 0 Resource Unavailable: 0 Max BER Size Exceeded: 0 Binds: 1 Unbinds: 0 -------------------------------- - LDAP v2 Binds: 0 - LDAP v3 Binds: 1 - AUTOBINDs(LDAPI): 0 - SSL Client Binds: 0 - Failed SSL Client Binds: 0 - SASL Binds: 0 - Directory Manager Binds: 0 - Anonymous Binds: 0 Cleaning up temp files... Done. INFO  lib389.utils:ticket47910_test.py:67 standard errors
Passed tickets/ticket47910_test.py::test_ticket47910_logconv_start_end_negative 0.13
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:ticket47910_test.py:105 Running test_ticket47910 - Execute logconv.pl -S -E with starttime>endtime INFO  lib389.utils:ticket47910_test.py:107 taking current time with offset of 2 mins and formatting it to feed -S INFO  lib389.utils:ticket47910_test.py:111 taking current time with offset of 2 mins and formatting it to feed -E INFO  lib389.utils:ticket47910_test.py:115 Executing logconv.pl with -S and -E INFO  lib389.utils:ticket47910_test.py:61 Executing logconv.pl with -S current time and -E end time INFO  lib389.utils:ticket47910_test.py:63 /usr/bin/logconv.pl -S [20/Oct/2020:21:23:17] -E [20/Oct/2020:21:19:17] /var/log/dirsrv/slapd-standalone1/access INFO  lib389.utils:ticket47910_test.py:66 standard output: Access Log Analyzer 8.2 Command: logconv.pl /var/log/dirsrv/slapd-standalone1/access Start time ([20/Oct/2020:21:23:17]) is greater than end time ([20/Oct/2020:21:19:17])! Cleaning up temp files... Done. INFO  lib389.utils:ticket47910_test.py:67 standard errors
Passed tickets/ticket47910_test.py::test_ticket47910_logconv_start_end_invalid 0.13
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:ticket47910_test.py:128 Running test_ticket47910 - Execute logconv.pl -S -E with invalid timestamp INFO  lib389.utils:ticket47910_test.py:129 Set start time and end time to invalid values INFO  lib389.utils:ticket47910_test.py:133 Executing logconv.pl with -S and -E INFO  lib389.utils:ticket47910_test.py:61 Executing logconv.pl with -S current time and -E end time INFO  lib389.utils:ticket47910_test.py:63 /usr/bin/logconv.pl -S invalid -E invalid /var/log/dirsrv/slapd-standalone1/access INFO  lib389.utils:ticket47910_test.py:66 standard output: Access Log Analyzer 8.2 Command: logconv.pl /var/log/dirsrv/slapd-standalone1/access The date string (invalid) is invalid, exiting... Cleaning up temp files... Done. INFO  lib389.utils:ticket47910_test.py:67 standard errors
Passed tickets/ticket47910_test.py::test_ticket47910_logconv_noaccesslogs 0.15
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:ticket47910_test.py:147 Running test_ticket47910 - Execute logconv.pl without access logs INFO  lib389.utils:ticket47910_test.py:149 taking current time with offset of 2 mins and formatting it to feed -S INFO  lib389.utils:ticket47910_test.py:152 Executing logconv.pl with -S current time INFO  lib389.utils:ticket47910_test.py:154 /usr/bin/logconv.pl -S [20/Oct/2020:21:19:17] INFO  lib389.utils:ticket47910_test.py:157 standard output: There are no access logs specified, or the tool options have not been used correctly! Cleaning up temp files... Done. INFO  lib389.utils:ticket47910_test.py:158 standard errors
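The ticket47910 cases drive logconv.pl's -S/-E time-window options end to end: a valid window yields the analyzer summary, a start time later than the end time is rejected, an unparseable date exits with an error, and omitting the access log path prints a usage complaint. A small sketch of the same invocation from Python, reusing the bracketed timestamp format shown in the logs above:

import subprocess

cmd = [
    "/usr/bin/logconv.pl",
    "-S", "[20/Oct/2020:21:19:16]",
    "-E", "[20/Oct/2020:21:23:16]",
    "/var/log/dirsrv/slapd-standalone1/access",
]
result = subprocess.run(cmd, capture_output=True, text=True)
print(result.stdout)  # analyzer summary, or the start>end / invalid-date error
print(result.stderr)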
Passed tickets/ticket47920_test.py::test_ticket47920_init 0.27
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47920_test.py::test_ticket47920_mod_readentry_ctrl 0.15
------------------------------Captured stdout call------------------------------
['final description']
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47920_test.py:65 ############################################### INFO  lib389:ticket47920_test.py:66 ####### INFO  lib389:ticket47920_test.py:67 ####### MOD: with a readentry control INFO  lib389:ticket47920_test.py:68 ####### INFO  lib389:ticket47920_test.py:69 ############################################### INFO  lib389:ticket47920_test.py:106 Check the initial value of the entry
Passed tickets/ticket47921_test.py::test_ticket47921 0.29
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket47921_test:ticket47921_test.py:81 Test complete
Passed tickets/ticket47927_test.py::test_ticket47927_init 4.97
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47927_test.py::test_ticket47927_one 0.27
-------------------------------Captured log call--------------------------------
CRITICAL tests.tickets.ticket47927_test:ticket47927_test.py:100 test_ticket47927_one: Failed (expected) to set the telephonenumber for cn=test_2,cn=enforced_container,dc=example,dc=com: Constraint violation CRITICAL tests.tickets.ticket47927_test:ticket47927_test.py:111 test_ticket47927_one: Failed (expected) to set the telephonenumber for cn=test_3,cn=excluded_container,dc=example,dc=com: Constraint violation
Passed tickets/ticket47927_test.py::test_ticket47927_two 3.61
No log output captured.
Passed tickets/ticket47927_test.py::test_ticket47927_three 0.29
-------------------------------Captured log call--------------------------------
CRITICAL tests.tickets.ticket47927_test:ticket47927_test.py:151 test_ticket47927_three: Failed (expected) to set the telephonenumber for cn=test_2,cn=enforced_container,dc=example,dc=com: Constraint violation CRITICAL tests.tickets.ticket47927_test:ticket47927_test.py:158 test_ticket47927_three: successfully set the telephonenumber for cn=test_3,cn=excluded_container,dc=example,dc=com
Passed tickets/ticket47927_test.py::test_ticket47927_four 0.01
-------------------------------Captured log call--------------------------------
CRITICAL tests.tickets.ticket47927_test:ticket47927_test.py:176 test_ticket47927_four: successfully set the telephonenumber for cn=test_3,cn=excluded_container,dc=example,dc=com CRITICAL tests.tickets.ticket47927_test:ticket47927_test.py:198 test_ticket47927_four: Failed (expected) to set the telephonenumber for cn=test_2,cn=enforced_container,dc=example,dc=com: Constraint violation
Passed tickets/ticket47927_test.py::test_ticket47927_five 4.05
No log output captured.
Passed tickets/ticket47927_test.py::test_ticket47927_six 0.04
-------------------------------Captured log call--------------------------------
CRITICAL tests.tickets.ticket47927_test:ticket47927_test.py:240 test_ticket47927_six: Failed (expected) to set the telephonenumber for cn=test_2,cn=enforced_container,dc=example,dc=com: Constraint violation CRITICAL tests.tickets.ticket47927_test:ticket47927_test.py:247 test_ticket47927_six: successfully set the telephonenumber for cn=test_3,cn=excluded_container,dc=example,dc=com CRITICAL tests.tickets.ticket47927_test:ticket47927_test.py:256 test_ticket47927_six: successfully set the telephonenumber for cn=test_4,cn=excluded_bis_container,dc=example,dc=com
Passed tickets/ticket47931_test.py::test_ticket47931 36.95
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
Exception in thread Thread-3:
Traceback (most recent call last):
  File "/usr/lib64/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/export/tests/tickets/ticket47931_test.py", line 36, in run
    conn.set_option(ldap.OPT_TIMEOUT, self.timeout)
  File "/usr/local/lib/python3.8/site-packages/lib389/__init__.py", line 180, in inner
    return f(*args, **kwargs)
  File "/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py", line 937, in set_option
    return self._ldap_call(self._l.set_option,option,invalue)
  File "/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py", line 362, in __getattr__
    raise AttributeError('%s has no attribute %s' % (
AttributeError: DirSrv has no attribute '_l'
-------------------------------Captured log call--------------------------------
INFO  lib389:backend.py:80 List backend with suffix=dc=deadlock INFO  lib389:backend.py:290 Creating a local backend INFO  lib389:backend.py:76 List backend cn=deadlock,cn=ldbm database,cn=plugins,cn=config INFO  lib389:__init__.py:1772 Found entry dn: cn=deadlock,cn=ldbm database,cn=plugins,cn=config cn: deadlock nsslapd-cachememsize: 512000 nsslapd-cachesize: -1 nsslapd-directory: /var/lib/dirsrv/slapd-standalone1/db/deadlock nsslapd-dncachememsize: 16777216 nsslapd-readonly: off nsslapd-require-index: off nsslapd-require-internalop-index: off nsslapd-suffix: dc=deadlock objectClass: top objectClass: extensibleObject objectClass: nsBackendInstance INFO  lib389:mappingTree.py:154 Entry dn: cn="dc=deadlock",cn=mapping tree,cn=config cn: dc=deadlock nsslapd-backend: deadlock nsslapd-state: backend objectclass: top objectclass: extensibleObject objectclass: nsMappingTree INFO  lib389:__init__.py:1772 Found entry dn: cn=dc\3Ddeadlock,cn=mapping tree,cn=config cn: dc=deadlock nsslapd-backend: deadlock nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree INFO  tests.tickets.ticket47931_test:ticket47931_test.py:142 Adding members to the group... INFO  tests.tickets.ticket47931_test:ticket47931_test.py:158 Finished adding members to the group. INFO  tests.tickets.ticket47931_test:ticket47931_test.py:164 Test complete
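The stderr traceback captured above comes from calling set_option() on a DirSrv whose internal python-ldap handle ('_l') does not exist yet. A hedged sketch of the ordering that avoids the AttributeError; host, port, and server-id values are placeholders, not taken from this test.

    import ldap
    from lib389 import DirSrv
    from lib389.properties import SER_HOST, SER_PORT, SER_SERVERID_PROP

    # Assumed allocation values; the point is the ordering: open() creates
    # the internal LDAP handle, so it must come before set_option().
    inst = DirSrv(verbose=False)
    inst.allocate({SER_HOST: 'localhost', SER_PORT: 38901,
                   SER_SERVERID_PROP: 'standalone1'})
    inst.open()                            # bind; creates the '_l' handle
    inst.set_option(ldap.OPT_TIMEOUT, 10)  # safe only after open()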
Passed tickets/ticket47953_test.py::test_ticket47953 4.27
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:498 Import task import_10202020_212245 for file /var/lib/dirsrv/slapd-standalone1/ldif/ticket47953.ldif completed successfully
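For reference, a hedged sketch of the lib389 call behind an "Import task ... completed successfully" line like the one above; the helper name and suffix default are assumptions.

    from lib389.tasks import Tasks
    from lib389._constants import TASK_WAIT

    def import_ldif(inst, ldif_path, suffix='dc=example,dc=com'):
        # Online import task; TASK_WAIT blocks until the task finishes.
        Tasks(inst).importLDIF(suffix=suffix, input_file=ldif_path,
                               args={TASK_WAIT: True})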
Passed tickets/ticket47963_test.py::test_ticket47963 8.05
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket47963_test:ticket47963_test.py:145 Test complete
Passed tickets/ticket47970_test.py::test_ticket47970 0.29
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47973_test.py::test_ticket47973 2.43
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47976_test.py::test_ticket47976_init 4.97
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47976_test.py::test_ticket47976_1 3.53
No log output captured.
Passed tickets/ticket47976_test.py::test_ticket47976_2 4.32
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket47976_test:ticket47976_test.py:99 Test complete INFO  tests.tickets.ticket47976_test:ticket47976_test.py:104 Export LDIF file... INFO  lib389:tasks.py:567 Export task export_10202020_212342 for file /var/lib/dirsrv/slapd-standalone1/ldif/export.ldif completed successfully INFO  tests.tickets.ticket47976_test:ticket47976_test.py:115 Import LDIF file... INFO  lib389:tasks.py:498 Import task import_10202020_212344 for file /var/lib/dirsrv/slapd-standalone1/ldif/export.ldif completed successfully
Passed tickets/ticket47976_test.py::test_ticket47976_3 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket47976_test:ticket47976_test.py:131 Testing if the delete will hang or not INFO  tests.tickets.ticket47976_test:ticket47976_test.py:150 user0 was correctly deleted INFO  tests.tickets.ticket47976_test:ticket47976_test.py:150 user1 was correctly deleted
Passed tickets/ticket47980_test.py::test_ticket47980 3.20
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47981_test.py::test_ticket47981 3.63
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:backend.py:80 List backend with suffix=o=netscaperoot INFO  lib389:backend.py:290 Creating a local backend INFO  lib389:backend.py:76 List backend cn=netscaperoot,cn=ldbm database,cn=plugins,cn=config INFO  lib389:__init__.py:1772 Found entry dn: cn=netscaperoot,cn=ldbm database,cn=plugins,cn=config cn: netscaperoot nsslapd-cachememsize: 512000 nsslapd-cachesize: -1 nsslapd-directory: /var/lib/dirsrv/slapd-standalone1/db/netscaperoot nsslapd-dncachememsize: 16777216 nsslapd-readonly: off nsslapd-require-index: off nsslapd-require-internalop-index: off nsslapd-suffix: o=netscaperoot objectClass: top objectClass: extensibleObject objectClass: nsBackendInstance INFO  lib389:mappingTree.py:154 Entry dn: cn="o=netscaperoot",cn=mapping tree,cn=config cn: o=netscaperoot nsslapd-backend: netscaperoot nsslapd-state: backend objectclass: top objectclass: extensibleObject objectclass: nsMappingTree INFO  lib389:__init__.py:1772 Found entry dn: cn=o\3Dnetscaperoot,cn=mapping tree,cn=config cn: o=netscaperoot nsslapd-backend: netscaperoot nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree
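The backend and mapping-tree entries listed above are created by lib389's low-level helpers; a sketch of the equivalent operation through lib389's Backends interface (helper name and defaults are illustrative).

    from lib389.backend import Backends

    def add_backend(inst, suffix='o=netscaperoot', name='netscaperoot'):
        # Creates the backend entry (and its mapping tree) for the suffix.
        return Backends(inst).create(properties={'cn': name,
                                                 'nsslapd-suffix': suffix})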
Passed tickets/ticket48005_test.py::test_ticket48005_setup 12.15
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
grep: /etc/sysconfig/dirsrv: No such file or directory grep: /etc/sysconfig/dirsrv.systemd: No such file or directory ls: cannot access '/var/lib/dirsrv/slapd-standalone1/ldif/ticket48005.ldif': No such file or directory
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48005_test:ticket48005_test.py:31 Ticket 48005 setup... INFO  tests.tickets.ticket48005_test:ticket48005_test.py:41 No ulimit -c in /etc/sysconfig/dirsrv INFO  tests.tickets.ticket48005_test:ticket48005_test.py:42 Adding it INFO  tests.tickets.ticket48005_test:ticket48005_test.py:50 No LimitCORE in /etc/sysconfig/dirsrv.systemd INFO  tests.tickets.ticket48005_test:ticket48005_test.py:51 Adding it INFO  tests.tickets.ticket48005_test:ticket48005_test.py:64 dbgen_prog: /bin/dbgen.pl INFO  tests.tickets.ticket48005_test:ticket48005_test.py:70 We have 10106 entries. ERROR  lib389:tasks.py:495 Error: import task import_10202020_212500 for file /var/lib/dirsrv/slapd-standalone1/ldif/ticket48005.ldif exited with -23 INFO  tests.tickets.ticket48005_test:ticket48005_test.py:75 Importing /var/lib/dirsrv/slapd-standalone1/ldif/ticket48005.ldif complete.
Passed tickets/ticket48005_test.py::test_ticket48005_memberof 12.67
------------------------------Captured stderr call------------------------------
ls: cannot access '/var/log/dirsrv/slapd-standalone1/core*': No such file or directory
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48005_test:ticket48005_test.py:86 Ticket 48005 memberof test... INFO  lib389:tasks.py:877 fixupMemberOf task fixupmemberof_10202020_212512 for basedn dc=example,dc=com completed successfully INFO  tests.tickets.ticket48005_test:ticket48005_test.py:110 No core files are found INFO  tests.tickets.ticket48005_test:ticket48005_test.py:119 Ticket 48005 memberof test complete
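A short sketch of the task invocation behind the "fixupMemberOf task ... completed successfully" line above; the helper name and suffix default are assumptions.

    from lib389.tasks import Tasks
    from lib389._constants import TASK_WAIT

    def rebuild_memberof(inst, suffix='dc=example,dc=com'):
        # Regenerates memberOf values under the suffix; blocks until done.
        Tasks(inst).fixupMemberOf(suffix=suffix, args={TASK_WAIT: True})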
Passed tickets/ticket48005_test.py::test_ticket48005_automember 19.96
------------------------------Captured stderr call------------------------------
ls: cannot access '/var/log/dirsrv/slapd-standalone1/core*': No such file or directory ls: cannot access '/var/log/dirsrv/slapd-standalone1/core*': No such file or directory ls: cannot access '/var/log/dirsrv/slapd-standalone1/core*': No such file or directory
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48005_test:ticket48005_test.py:138 Ticket 48005 automember test... INFO  tests.tickets.ticket48005_test:ticket48005_test.py:143 Adding automember config INFO  lib389:tasks.py:986 Automember Rebuild Membership task (task-10202020_212524) completed successfully INFO  tests.tickets.ticket48005_test:ticket48005_test.py:176 No core files are found INFO  lib389:tasks.py:1039 Automember Export Updates task (task-10202020_212528) completed successfully INFO  tests.tickets.ticket48005_test:ticket48005_test.py:198 No core files are found INFO  lib389:tasks.py:1087 Automember Map Updates task (task-10202020_212531) completed successfully INFO  tests.tickets.ticket48005_test:ticket48005_test.py:222 No core files are found INFO  tests.tickets.ticket48005_test:ticket48005_test.py:231 Ticket 48005 automember test complete
Passed tickets/ticket48005_test.py::test_ticket48005_syntaxvalidate 3.46
------------------------------Captured stderr call------------------------------
ls: cannot access '/var/log/dirsrv/slapd-standalone1/core*': No such file or directory
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48005_test:ticket48005_test.py:241 Ticket 48005 syntax validate test... INFO  lib389:tasks.py:1255 Syntax Validate task (task-10202020_212540) completed successfully INFO  tests.tickets.ticket48005_test:ticket48005_test.py:261 No core files are found INFO  tests.tickets.ticket48005_test:ticket48005_test.py:265 Ticket 48005 syntax validate test complete
Passed tickets/ticket48005_test.py::test_ticket48005_schemareload 2.87
------------------------------Captured stderr call------------------------------
ls: cannot access '/var/log/dirsrv/slapd-standalone1/core*': No such file or directory
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48005_test:ticket48005_test.py:334 Ticket 48005 schema reload test... INFO  lib389:tasks.py:1169 Schema Reload task (task-10202020_212549) completed successfully INFO  tests.tickets.ticket48005_test:ticket48005_test.py:354 No core files are found INFO  tests.tickets.ticket48005_test:ticket48005_test.py:358 Ticket 48005 schema reload test complete
Passed tickets/ticket48013_test.py::test_ticket48013 0.78
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48013_test:ticket48013_test.py:77 Testing cookie: # INFO  tests.tickets.ticket48013_test:ticket48013_test.py:84 Invalid cookie correctly rejected: Invalid session cookie INFO  tests.tickets.ticket48013_test:ticket48013_test.py:77 Testing cookie: ## INFO  tests.tickets.ticket48013_test:ticket48013_test.py:84 Invalid cookie correctly rejected: Invalid session cookie INFO  tests.tickets.ticket48013_test:ticket48013_test.py:77 Testing cookie: a#a#a INFO  tests.tickets.ticket48013_test:ticket48013_test.py:84 Invalid cookie correctly rejected: Invalid session cookie INFO  tests.tickets.ticket48013_test:ticket48013_test.py:77 Testing cookie: a#a#1 INFO  tests.tickets.ticket48013_test:ticket48013_test.py:84 Invalid cookie correctly rejected: Invalid session cookie INFO  tests.tickets.ticket48013_test:ticket48013_test.py:88 Test complete
Passed tickets/ticket48026_test.py::test_ticket48026 4.76
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48026_test:ticket48026_test.py:114 Test complete
Passed tickets/ticket48109_test.py::test_ticket48109 27.39
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48109_test:ticket48109_test.py:32 Test case 0 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:77 match: conn=1 op=3 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:85 l1: [20/Oct/2020:21:26:22.735420002 -0400] conn=1 op=3 RESULT err=0 tag=101 nentries=1 wtime=0.000238286 optime=0.000342110 etime=0.000577140 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:88 match: nentries=1 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:93 Entry uid=a* found. INFO  tests.tickets.ticket48109_test:ticket48109_test.py:100 Test case 0 - OK - substr index used INFO  tests.tickets.ticket48109_test:ticket48109_test.py:119 Test case 1 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:163 match: conn=1 op=3 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:171 l1: [20/Oct/2020:21:26:22.735420002 -0400] conn=1 op=3 RESULT err=0 tag=101 nentries=1 wtime=0.000238286 optime=0.000342110 etime=0.000577140 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:174 match: nentries=1 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:179 Entry uid=*b found. INFO  tests.tickets.ticket48109_test:ticket48109_test.py:186 Test case 1 - OK - substr index used INFO  tests.tickets.ticket48109_test:ticket48109_test.py:208 Test case 2 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:259 match: conn=1 op=3 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:267 l1: [20/Oct/2020:21:26:22.735420002 -0400] conn=1 op=3 RESULT err=0 tag=101 nentries=1 wtime=0.000238286 optime=0.000342110 etime=0.000577140 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:270 match: nentries=1 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:275 Entry uid=c* found. INFO  tests.tickets.ticket48109_test:ticket48109_test.py:282 Test case 2-1 - OK - correct substr index used INFO  tests.tickets.ticket48109_test:ticket48109_test.py:294 match: conn=1 op=4 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:302 l1: [20/Oct/2020:21:26:40.888773573 -0400] conn=1 op=4 RESULT err=0 tag=101 nentries=1 wtime=0.000176220 optime=0.000235373 etime=0.000409172 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:305 match: nentries=1 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:310 Entry uid=*2 found. INFO  tests.tickets.ticket48109_test:ticket48109_test.py:317 Test case 2-2 - OK - correct substr index used INFO  tests.tickets.ticket48109_test:ticket48109_test.py:331 Testcase PASSED
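The "match: conn=1 op=3 ... nentries=1" checks above boil down to scanning the access log for the RESULT record of a given operation; a sketch under that assumption (regex and helper name are illustrative, not the test's code).

    import re

    # Matches RESULT records such as:
    #   conn=1 op=3 RESULT err=0 tag=101 nentries=1 wtime=... etime=...
    RESULT_RE = re.compile(r'conn=(\d+) op=(\d+) RESULT .*? nentries=(\d+)')

    def nentries_for(access_log, conn, op):
        with open(access_log) as f:
            for line in f:
                m = RESULT_RE.search(line)
                if m and (int(m.group(1)), int(m.group(2))) == (conn, op):
                    return int(m.group(3))
        return None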
Passed tickets/ticket48170_test.py::test_ticket48170 0.01
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48170_test:ticket48170_test.py:29 Index update correctly rejected INFO  tests.tickets.ticket48170_test:ticket48170_test.py:36 Test complete
Passed tickets/ticket48194_test.py::test_init 6.59
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ############################################### INFO  lib389:ticket48194_test.py:41 ####### Testing Ticket 48194 - harden the list of ciphers available by default INFO  lib389:ticket48194_test.py:42 ############################################### INFO  lib389.utils:ticket48194_test.py:57 ######################### enable SSL in the directory server with all ciphers ######################
Passed tickets/ticket48194_test.py::test_run_0 4.72
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ############################################### INFO  lib389:ticket48194_test.py:41 ####### Test Case 1 - Check the cipher availability for "+all"; allowWeakCipher: on INFO  lib389:ticket48194_test.py:42 ############################################### INFO  lib389.utils:ticket48194_test.py:131 ######################### Restarting the server ###################### INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect the handshake to succeed INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256\n' INFO  lib389.utils:ticket48194_test.py:86 Testing AES256-SHA256 -- expect the handshake to succeed INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher AES256-SHA256 INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256\n'
Passed tickets/ticket48194_test.py::test_run_3 5.46
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ############################################### INFO  lib389:ticket48194_test.py:41 ####### Test Case 4 - Check the cipher availability for "-all" INFO  lib389:ticket48194_test.py:42 ############################################### INFO  lib389.utils:ticket48194_test.py:199 ######################### Restarting the server ###################### INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect the handshake to fail INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, (NONE), Cipher is (NONE)\n' INFO  lib389.utils:ticket48194_test.py:86 Testing AES256-SHA256 -- expect the handshake to fail INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher AES256-SHA256 INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, (NONE), Cipher is (NONE)\n'
Passed tickets/ticket48194_test.py::test_run_9 6.64
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ############################################### INFO  lib389:ticket48194_test.py:41 ####### Test Case 10 - Check no nsSSL3Ciphers (default setting) with no errorlog-level & allowWeakCipher on INFO  lib389:ticket48194_test.py:42 ############################################### INFO  lib389.utils:ticket48194_test.py:316 ######################### Restarting the server ###################### INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect the handshake to succeed INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256\n' INFO  lib389.utils:ticket48194_test.py:86 Testing AES256-SHA256 -- expect the handshake to succeed INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher AES256-SHA256 INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256\n'
Passed tickets/ticket48194_test.py::test_run_11 5.46
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ############################################### INFO  lib389:ticket48194_test.py:41 ####### Test Case 12 - Check nsSSL3Ciphers: +fortezza, which is not supported INFO  lib389:ticket48194_test.py:42 ############################################### INFO  lib389.utils:ticket48194_test.py:337 ######################### Restarting the server ###################### INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect the handshake to fail INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, (NONE), Cipher is (NONE)\n' INFO  lib389.utils:ticket48194_test.py:86 Testing AES256-SHA256 -- expect the handshake to fail INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher AES256-SHA256 INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, (NONE), Cipher is (NONE)\n'
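Each ticket48194 case above probes one cipher with openssl s_client and inspects the "New, ..." status line; a minimal sketch of such a probe (helper name and timeout are assumptions).

    import subprocess

    def cipher_negotiated(port, cipher, host='localhost'):
        # "New, (NONE), Cipher is (NONE)" on the status line means the
        # handshake failed; anything else names the negotiated cipher.
        cmd = ['/usr/bin/openssl', 's_client',
               '-connect', '%s:%s' % (host, port), '-cipher', cipher]
        proc = subprocess.run(cmd, input=b'', capture_output=True, timeout=30)
        for line in proc.stdout.splitlines():
            if line.startswith(b'New,'):
                return b'(NONE)' not in line
        return False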
Passed tickets/ticket48212_test.py::test_ticket48212 14.02
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
[20/Oct/2020:21:28:20.693397868 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000 [20/Oct/2020:21:28:20.704511561 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/givenName.db: ok [20/Oct/2020:21:28:20.708035577 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/aci.db: ok [20/Oct/2020:21:28:20.716751618 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/sn.db: ok [20/Oct/2020:21:28:20.719896276 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/numsubordinates.db: ok [20/Oct/2020:21:28:20.722523505 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/entryusn.db: ok [20/Oct/2020:21:28:20.725428830 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/nsuniqueid.db: ok [20/Oct/2020:21:28:20.729328404 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/ancestorid.db: ok [20/Oct/2020:21:28:20.732496906 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/parentid.db: ok [20/Oct/2020:21:28:20.736743765 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/mail.db: ok [20/Oct/2020:21:28:20.741411756 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/cn.db: ok [20/Oct/2020:21:28:20.744648661 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/uid.db: ok [20/Oct/2020:21:28:20.749487389 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/entryrdn.db: ok [20/Oct/2020:21:28:20.756436466 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/id2entry.db: ok [20/Oct/2020:21:28:20.760958792 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/objectclass.db: ok [20/Oct/2020:21:28:24.353507972 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000 [20/Oct/2020:21:28:24.362953528 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/givenName.db: ok [20/Oct/2020:21:28:24.366137249 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/aci.db: ok [20/Oct/2020:21:28:24.371272952 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/sn.db: ok [20/Oct/2020:21:28:24.374191152 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/numsubordinates.db: ok [20/Oct/2020:21:28:24.377091951 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/entryusn.db: ok [20/Oct/2020:21:28:24.380302791 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/nsuniqueid.db: ok [20/Oct/2020:21:28:24.383383969 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/ancestorid.db: ok [20/Oct/2020:21:28:24.386485089 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/parentid.db: ok [20/Oct/2020:21:28:24.390740819 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/mail.db: ok [20/Oct/2020:21:28:24.395353922 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/cn.db: ok [20/Oct/2020:21:28:24.398480217 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/uid.db: ok [20/Oct/2020:21:28:24.402590751 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/entryrdn.db: ok [20/Oct/2020:21:28:24.411665167 -0400] - INFO - dbverify_ext - 
/var/lib/dirsrv/slapd-standalone1/db/userRoot/id2entry.db: ok [20/Oct/2020:21:28:24.416177532 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/objectclass.db: ok [20/Oct/2020:21:28:24.419384386 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/uidnumber.db: ok [20/Oct/2020:21:28:27.836891634 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000 [20/Oct/2020:21:28:27.845886916 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/givenName.db: ok [20/Oct/2020:21:28:27.849489748 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/aci.db: ok [20/Oct/2020:21:28:27.853878431 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/sn.db: ok [20/Oct/2020:21:28:27.856943262 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/numsubordinates.db: ok [20/Oct/2020:21:28:27.860976062 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/entryusn.db: ok [20/Oct/2020:21:28:27.864182656 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/nsuniqueid.db: ok [20/Oct/2020:21:28:27.867431154 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/ancestorid.db: ok [20/Oct/2020:21:28:27.870502263 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/parentid.db: ok [20/Oct/2020:21:28:27.874210482 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/mail.db: ok [20/Oct/2020:21:28:27.878762164 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/cn.db: ok [20/Oct/2020:21:28:27.881768175 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/uid.db: ok [20/Oct/2020:21:28:27.885708037 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/entryrdn.db: ok [20/Oct/2020:21:28:27.892501191 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/id2entry.db: ok [20/Oct/2020:21:28:27.899211674 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/objectclass.db: ok [20/Oct/2020:21:28:27.903291147 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/uidnumber.db: ok [20/Oct/2020:21:28:31.261171271 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000 [20/Oct/2020:21:28:31.271417286 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/givenName.db: ok [20/Oct/2020:21:28:31.274883418 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/aci.db: ok [20/Oct/2020:21:28:31.280233428 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/sn.db: ok [20/Oct/2020:21:28:31.284015238 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/numsubordinates.db: ok [20/Oct/2020:21:28:31.287120580 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/entryusn.db: ok [20/Oct/2020:21:28:31.291007195 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/nsuniqueid.db: ok [20/Oct/2020:21:28:31.299332012 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/ancestorid.db: ok [20/Oct/2020:21:28:31.303648671 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/parentid.db: ok [20/Oct/2020:21:28:31.308065826 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/mail.db: ok 
[20/Oct/2020:21:28:31.313397132 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/cn.db: ok [20/Oct/2020:21:28:31.317719622 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/uid.db: ok [20/Oct/2020:21:28:31.321668826 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/entryrdn.db: ok [20/Oct/2020:21:28:31.327478290 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/id2entry.db: ok [20/Oct/2020:21:28:31.334076567 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/objectclass.db: ok [20/Oct/2020:21:28:31.338869678 -0400] - INFO - dbverify_ext - /var/lib/dirsrv/slapd-standalone1/db/userRoot/uidnumber.db: ok
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48212_test.py:70 Bind as cn=Directory Manager INFO  lib389:ticket48212_test.py:83 ######################### Import Test data (/var/lib/dirsrv/slapd-standalone1/ldif/example1k_posix.ldif) ###################### INFO  lib389:tasks.py:498 Import task import_10202020_212817 for file /var/lib/dirsrv/slapd-standalone1/ldif/example1k_posix.ldif completed successfully INFO  lib389:ticket48212_test.py:19 +++++ dbverify +++++ INFO  lib389:ticket48212_test.py:23 Running /usr/sbin/dbverify -Z standalone1 -V INFO  lib389:ticket48212_test.py:43 dbverify passed INFO  lib389:ticket48212_test.py:92 ######################### Add index by uidnumber ###################### INFO  lib389:ticket48212_test.py:101 ######################### reindexing... ###################### INFO  lib389:ticket48212_test.py:47 +++++ reindex uidnumber +++++ INFO  lib389:tasks.py:798 Index task index_attrs_10202020_212821 completed successfully INFO  lib389:ticket48212_test.py:19 +++++ dbverify +++++ INFO  lib389:ticket48212_test.py:23 Running /usr/sbin/dbverify -Z standalone1 -V INFO  lib389:ticket48212_test.py:43 dbverify passed INFO  lib389:ticket48212_test.py:106 ######################### Add nsMatchingRule ###################### INFO  lib389:ticket48212_test.py:112 ######################### reindexing... ###################### INFO  lib389:ticket48212_test.py:47 +++++ reindex uidnumber +++++ INFO  lib389:tasks.py:798 Index task index_attrs_10202020_212824 completed successfully INFO  lib389:ticket48212_test.py:19 +++++ dbverify +++++ INFO  lib389:ticket48212_test.py:23 Running /usr/sbin/dbverify -Z standalone1 -V INFO  lib389:ticket48212_test.py:43 dbverify passed INFO  lib389:ticket48212_test.py:117 ######################### Delete nsMatchingRule ###################### INFO  lib389:ticket48212_test.py:47 +++++ reindex uidnumber +++++ INFO  lib389:tasks.py:798 Index task index_attrs_10202020_212828 completed successfully INFO  lib389:ticket48212_test.py:19 +++++ dbverify +++++ INFO  lib389:ticket48212_test.py:23 Running /usr/sbin/dbverify -Z standalone1 -V INFO  lib389:ticket48212_test.py:43 dbverify passed
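A sketch of the "+++++ dbverify +++++" step as the log presents it: run dbverify against the instance and treat a zero exit status as a pass (helper name is illustrative).

    import subprocess

    def dbverify_ok(serverid='standalone1'):
        # Same invocation as the log's "Running /usr/sbin/dbverify -Z ... -V".
        proc = subprocess.run(['/usr/sbin/dbverify', '-Z', serverid, '-V'],
                              capture_output=True, text=True)
        return proc.returncode == 0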
Passed tickets/ticket48214_test.py::test_ticket48214_run 0.08
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48214_test.py:83 Bind as cn=Directory Manager INFO  lib389:ticket48214_test.py:86 ######################### Out of Box ###################### INFO  lib389:ticket48214_test.py:44 +++++ Check Max Ber Size +++++ INFO  lib389:ticket48214_test.py:18 +++++ Get maxbersize from dse.ldif +++++ INFO  lib389:ticket48214_test.py:21 Run CMD: egrep nsslapd-maxbersize /etc/dirsrv/slapd-standalone1/dse.ldif INFO  lib389:ticket48214_test.py:28 Empty: INFO  lib389:ticket48214_test.py:50 No nsslapd-maxbersize found in dse.ldif INFO  lib389:ticket48214_test.py:63 ldapsearch returned nsslapd-maxbersize: b'2097152' INFO  lib389:ticket48214_test.py:72 Checking 2097152 vs 2097152 INFO  lib389:ticket48214_test.py:89 ######################### Add nsslapd-maxbersize: 0 ###################### INFO  lib389:ticket48214_test.py:44 +++++ Check Max Ber Size +++++ INFO  lib389:ticket48214_test.py:18 +++++ Get maxbersize from dse.ldif +++++ INFO  lib389:ticket48214_test.py:21 Run CMD: egrep nsslapd-maxbersize /etc/dirsrv/slapd-standalone1/dse.ldif INFO  lib389:ticket48214_test.py:35 Right format - nsslapd-maxbersize: 0 INFO  lib389:ticket48214_test.py:52 nsslapd-maxbersize: 0 INFO  lib389:ticket48214_test.py:63 ldapsearch returned nsslapd-maxbersize: b'2097152' INFO  lib389:ticket48214_test.py:72 Checking 2097152 vs 2097152 INFO  lib389:ticket48214_test.py:93 ######################### Add nsslapd-maxbersize: 10000 ###################### INFO  lib389:ticket48214_test.py:44 +++++ Check Max Ber Size +++++ INFO  lib389:ticket48214_test.py:18 +++++ Get maxbersize from dse.ldif +++++ INFO  lib389:ticket48214_test.py:21 Run CMD: egrep nsslapd-maxbersize /etc/dirsrv/slapd-standalone1/dse.ldif INFO  lib389:ticket48214_test.py:35 Right format - nsslapd-maxbersize: 10000 INFO  lib389:ticket48214_test.py:55 nsslapd-maxbersize: 10000 INFO  lib389:ticket48214_test.py:63 ldapsearch returned nsslapd-maxbersize: b'10000' INFO  lib389:ticket48214_test.py:98 ticket48214 was successfully verified.
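The dse.ldif half of the check above is an attribute grep; a sketch follows (helper name and default path are assumptions; None stands for "not set", where the server falls back to its 2097152-byte default, matching the log).

    import re

    def maxbersize_from_dse(dse_path='/etc/dirsrv/slapd-standalone1/dse.ldif'):
        # Equivalent of the egrep step above; None means the attribute is
        # absent and the server default (2097152) applies.
        with open(dse_path) as f:
            for line in f:
                m = re.match(r'nsslapd-maxbersize:\s*(\d+)', line)
                if m:
                    return int(m.group(1))
        return None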
Passed tickets/ticket48228_test.py::test_ticket48228_test_global_policy 13.98
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket48233_test.py::test_ticket48233 5.55
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48233_test:ticket48233_test.py:54 Test complete
Passed tickets/ticket48234_test.py::test_ticket48234 0.61
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48234_test:ticket48234_test.py:35 Bind as root DN INFO  tests.tickets.ticket48234_test:ticket48234_test.py:46 Add aci which contains extensible filter. INFO  tests.tickets.ticket48234_test:ticket48234_test.py:58 Add entries ... INFO  tests.tickets.ticket48234_test:ticket48234_test.py:61 adding OU0 under dc=example,dc=com... INFO  tests.tickets.ticket48234_test:ticket48234_test.py:64 adding outest under ou=OU0,dc=example,dc=com... INFO  tests.tickets.ticket48234_test:ticket48234_test.py:61 adding OU1 under dc=example,dc=com... INFO  tests.tickets.ticket48234_test:ticket48234_test.py:64 adding outest under ou=OU1,dc=example,dc=com... INFO  tests.tickets.ticket48234_test:ticket48234_test.py:69 adding admin under ou=outest,ou=OU0,dc=example,dc=com... INFO  tests.tickets.ticket48234_test:ticket48234_test.py:69 adding admin under ou=outest,ou=OU1,dc=example,dc=com... INFO  tests.tickets.ticket48234_test:ticket48234_test.py:73 Bind as user cn=admin,ou=outest,ou=OU1,dc=example,dc=com INFO  tests.tickets.ticket48234_test:ticket48234_test.py:92 Test complete
Passed tickets/ticket48252_test.py::test_ticket48252_setup 0.24
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket48252_test.py::test_ticket48252_run_0 10.96
------------------------------Captured stderr call------------------------------
[20/Oct/2020:21:29:28.960715879 -0400] - INFO - slapd_exemode_db2index - Backend Instance: userRoot [20/Oct/2020:21:29:28.966793339 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000 [20/Oct/2020:21:29:28.974830447 -0400] - INFO - bdb_instance_start - Import is running with nsslapd-db-private-import-mem on; No other process is allowed to access the database [20/Oct/2020:21:29:28.978445299 -0400] - INFO - check_and_set_import_cache - pagesize: 4096, available bytes 7405436928, process usage 23011328 [20/Oct/2020:21:29:28.981800949 -0400] - INFO - check_and_set_import_cache - Import allocates 2892748KB import cache. [20/Oct/2020:21:29:29.159784919 -0400] - INFO - bdb_db2index - userRoot: Indexing attribute: cn [20/Oct/2020:21:29:29.163435824 -0400] - ERR - libdb - BDB1566 txn_checkpoint interface requires an environment configured for the transaction subsystem [20/Oct/2020:21:29:29.166856881 -0400] - ERR - dblayer_force_checkpoint - Checkpoint FAILED, error Invalid argument (22) [20/Oct/2020:21:29:29.175389079 -0400] - INFO - bdb_db2index - userRoot: Finished indexing. [20/Oct/2020:21:29:29.196723797 -0400] - INFO - bdb_pre_close - All database threads now stopped
-------------------------------Captured log call--------------------------------
INFO  lib389:__init__.py:3073 Running script: ['/usr/bin/dbscan', '-f', '/var/lib/dirsrv/slapd-standalone1/db/userRoot/cn.db'] INFO  lib389:ticket48252_test.py:63 Did not find key test_user0 in dbscan output INFO  lib389:__init__.py:3073 Running script: ['/usr/bin/dbscan', '-f', '/var/lib/dirsrv/slapd-standalone1/db/userRoot/cn.db'] INFO  lib389:ticket48252_test.py:63 Did not find key test_user0 in dbscan output
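A sketch of the dbscan check logged above: dump the index file and test whether the key still occurs in the output (helper name and default key are illustrative).

    import subprocess

    def key_in_index(dbfile, key='test_user0'):
        # Same invocation as the log's dbscan call; the key is simply
        # searched for as a substring of the dump.
        out = subprocess.run(['/usr/bin/dbscan', '-f', dbfile],
                             capture_output=True, text=True).stdout
        return key in out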
Passed tickets/ticket48252_test.py::test_ticket48252_run_1 4.13
------------------------------Captured stderr call------------------------------
[20/Oct/2020:21:29:36.729345766 -0400] - INFO - slapd_exemode_db2index - Backend Instance: userRoot [20/Oct/2020:21:29:36.737593742 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000 [20/Oct/2020:21:29:36.744715204 -0400] - INFO - bdb_instance_start - Import is running with nsslapd-db-private-import-mem on; No other process is allowed to access the database [20/Oct/2020:21:29:36.752655232 -0400] - INFO - check_and_set_import_cache - pagesize: 4096, available bytes 7406592000, process usage 22970368 [20/Oct/2020:21:29:36.756014536 -0400] - INFO - check_and_set_import_cache - Import allocates 2893200KB import cache. [20/Oct/2020:21:29:36.969838031 -0400] - INFO - bdb_db2index - userRoot: Indexing attribute: objectclass [20/Oct/2020:21:29:36.973215619 -0400] - ERR - libdb - BDB1566 txn_checkpoint interface requires an environment configured for the transaction subsystem [20/Oct/2020:21:29:36.975653232 -0400] - ERR - dblayer_force_checkpoint - Checkpoint FAILED, error Invalid argument (22) [20/Oct/2020:21:29:36.983201383 -0400] - INFO - bdb_db2index - userRoot: Finished indexing. [20/Oct/2020:21:29:37.004690185 -0400] - INFO - bdb_pre_close - All database threads now stopped
Passed tickets/ticket48265_test.py::test_ticket48265_test 1.01
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48265_test:ticket48265_test.py:34 Adding 20 test entries... INFO  tests.tickets.ticket48265_test:ticket48265_test.py:51 Search with Ticket 47521 type complex filter INFO  tests.tickets.ticket48265_test:ticket48265_test.py:60 Search with Ticket 48265 type complex filter INFO  tests.tickets.ticket48265_test:ticket48265_test.py:69 Test 48265 complete
Passed tickets/ticket48270_test.py::test_ticket48270_init 0.62
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48270_test:ticket48270_test.py:26 Initialization: add dummy entries for the tests
Passed tickets/ticket48270_test.py::test_ticket48270_homeDirectory_indexed_cis 2.03
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48270_test:ticket48270_test.py:39 index homeDirectory in caseIgnoreIA5Match and caseExactIA5Match INFO  tests.tickets.ticket48270_test:ticket48270_test.py:57 successfully checked that a filter with the exact mr succeeds while a filter with a lowercase eq value fails INFO  lib389:tasks.py:798 Index task index_attrs_10202020_213033 completed successfully INFO  tests.tickets.ticket48270_test:ticket48270_test.py:63 Check indexing succeeded with a specified matching rule
Passed tickets/ticket48270_test.py::test_ticket48270_homeDirectory_mixed_value 0.00
No log output captured.
Passed tickets/ticket48270_test.py::test_ticket48270_extensible_search 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48270_test:ticket48270_test.py:91 Default: can retrieve an entry filter syntax with exact stored value INFO  tests.tickets.ticket48270_test:ticket48270_test.py:93 Default: can retrieve an entry filter caseExactIA5Match with exact stored value INFO  tests.tickets.ticket48270_test:ticket48270_test.py:97 Default: can not retrieve an entry filter syntax match with lowered stored value INFO  tests.tickets.ticket48270_test:ticket48270_test.py:103 Default: can not retrieve an entry filter caseExactIA5Match with lowered stored value INFO  tests.tickets.ticket48270_test:ticket48270_test.py:110 Default: can retrieve an entry filter caseIgnoreIA5Match with lowered stored value
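The "Default: can (not) retrieve ..." lines above exercise extensible-match filters, which pin the matching rule explicitly instead of relying on the attribute's default equality rule; a sketch with python-ldap (helper name, base DN, and value are placeholders).

    import ldap

    def extensible_search(conn, basedn, value):
        # The matching rule named in the filter overrides the attribute's
        # schema default; 'value' is illustrative, not from the report.
        exact = conn.search_s(basedn, ldap.SCOPE_SUBTREE,
                              '(homeDirectory:caseExactIA5Match:=%s)' % value)
        ignore = conn.search_s(basedn, ldap.SCOPE_SUBTREE,
                               '(homeDirectory:caseIgnoreIA5Match:=%s)' % value)
        return exact, ignore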
Passed tickets/ticket48272_test.py::test_ticket48272 9.43
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48272_test:ticket48272_test.py:129 Test PASSED
Passed tickets/ticket48294_test.py::test_48294_init 0.22
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48294_test.py:31 ############################################### INFO  lib389:ticket48294_test.py:32 ####### Testing Ticket 48294 - Linked Attributes plug-in - won't update links after MODRDN operation INFO  lib389:ticket48294_test.py:33 ###############################################
Passed tickets/ticket48294_test.py::test_48294_run_0 0.26
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48294_test.py:31 ############################################### INFO  lib389:ticket48294_test.py:32 ####### Case 0 - Rename employee1 and adjust the link type value by replace INFO  lib389:ticket48294_test.py:33 ############################################### INFO  lib389:ticket48294_test.py:59 ######################### MODRDN uid=employee2 ######################
Passed tickets/ticket48294_test.py::test_48294_run_1 0.16
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48294_test.py:31 ############################################### INFO  lib389:ticket48294_test.py:32 ####### Case 1 - Rename employee2 and adjust the link type value by delete and add INFO  lib389:ticket48294_test.py:33 ############################################### INFO  lib389:ticket48294_test.py:59 ######################### MODRDN uid=employee3 ######################
Passed tickets/ticket48294_test.py::test_48294_run_2 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48294_test.py:31 ############################################### INFO  lib389:ticket48294_test.py:32 ####### Case 2 - Rename manager1 to manager2 and make sure the managed attribute value is updated INFO  lib389:ticket48294_test.py:33 ############################################### INFO  lib389:ticket48294_test.py:59 ######################### MODRDN uid=manager2 ######################
Passed tickets/ticket48295_test.py::test_48295_init 0.53
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48295_test.py:30 ############################################### INFO  lib389:ticket48295_test.py:31 ####### Testing Ticket 48295 - Entry cache is not rolled back -- Linked Attributes plug-in - wrong behaviour when adding valid and broken links INFO  lib389:ticket48295_test.py:32 ###############################################
Passed tickets/ticket48295_test.py::test_48295_run 0.05
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48295_test.py:30 ############################################### INFO  lib389:ticket48295_test.py:31 ####### Add 2 linktypes to manager1 - one exists, another does not to make sure the managed entry does not have managed type. INFO  lib389:ticket48295_test.py:32 ###############################################
Passed tickets/ticket48312_test.py::test_ticket48312 0.59
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48312_test:ticket48312_test.py:117 Test complete
Passed tickets/ticket48354_test.py::test_ticket48354 0.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48354_test:ticket48354_test.py:50 Test PASSED
Passed tickets/ticket48366_test.py::test_ticket48366_init 0.40
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48366_test.py:44 Add subtree: ou=green,dc=example,dc=com INFO  lib389:ticket48366_test.py:48 Add subtree: ou=red,dc=example,dc=com INFO  lib389:ticket48366_test.py:54 Add cn=test,ou=people,dc=example,dc=com INFO  lib389:ticket48366_test.py:60 Add cn=proxy,ou=people,dc=example,dc=com INFO  lib389.utils:ticket48366_test.py:90 Adding %d test entries...
Passed tickets/ticket48366_test.py::test_ticket48366_search_user 0.06
No log output captured.
Passed tickets/ticket48366_test.py::test_ticket48366_search_dm 0.18
No log output captured.
Passed tickets/ticket48370_test.py::test_ticket48370 0.11
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48370_test:ticket48370_test.py:187 Test PASSED
Passed tickets/ticket48383_test.py::test_ticket48383 46.19
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
ldiffile: /var/lib/dirsrv/slapd-standalone1/ldif/standalone1.ldif
-------------------------------Captured log call--------------------------------
CRITICAL tests.tickets.ticket48383_test:ticket48383_test.py:62 Failed to change nsslapd-cachememsize: No such object INFO  tests.tickets.ticket48383_test:ticket48383_test.py:88 Test complete
Passed tickets/ticket48497_test.py::test_ticket48497_init 0.85
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48497_test:ticket48497_test.py:26 Initialization: add dummy entries for the tests
Passed tickets/ticket48497_test.py::test_ticket48497_homeDirectory_mixed_value 0.00
No log output captured.
Passed tickets/ticket48497_test.py::test_ticket48497_extensible_search 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48497_test:ticket48497_test.py:49 Default: can retrieve an entry filter syntax with exact stored value
INFO  tests.tickets.ticket48497_test:ticket48497_test.py:51 Default: can retrieve an entry filter caseExactIA5Match with exact stored value
INFO  tests.tickets.ticket48497_test:ticket48497_test.py:55 Default: can not retrieve an entry filter syntax match with lowered stored value
INFO  tests.tickets.ticket48497_test:ticket48497_test.py:61 Default: can not retrieve an entry filter caseExactIA5Match with lowered stored value
INFO  tests.tickets.ticket48497_test:ticket48497_test.py:68 Default: can retrieve an entry filter caseIgnoreIA5Match with lowered stored value
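The five checks above exercise LDAP extensible-match filters against the homeDirectory attribute. A minimal python-ldap sketch of the same searches; the URI and credentials are illustrative, not taken from this run:

import ldap

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('cn=Directory Manager', 'password')

# Exact-case matching rule: only the stored spelling matches.
exact = conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                      '(homeDirectory:caseExactIA5Match:=/home/xyz_0)')

# Case-insensitive matching rule: a lowered/uppered value still matches.
ignore = conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                       '(homeDirectory:caseIgnoreIA5Match:=/HOME/XYZ_0)')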
Passed tickets/ticket48497_test.py::test_ticket48497_homeDirectory_index_cfg 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48497_test:ticket48497_test.py:73 index homeDirectory in caseIgnoreIA5Match and caseExactIA5Match
Passed tickets/ticket48497_test.py::test_ticket48497_homeDirectory_index_run 2.02
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:798 Index task index_attrs_10202020_213456 completed successfully
INFO  tests.tickets.ticket48497_test:ticket48497_test.py:93 Check indexing succeeded with a specified matching rule
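The index task logged at tasks.py:798 is driven through lib389's Tasks API. A minimal sketch, assuming an already-started DirSrv instance named standalone (the variable name is an assumption):

from lib389.properties import TASK_WAIT

# Reindex homeDirectory and block until the task entry reports completion.
standalone.tasks.reindex(suffix='dc=example,dc=com',
                         attrname='homeDirectory',
                         args={TASK_WAIT: True})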
Passed tickets/ticket48665_test.py::test_ticket48665 0.30
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48665_test:ticket48665_test.py:40 5 entries are returned from the server.
CRITICAL tests.tickets.ticket48665_test:ticket48665_test.py:47 Failed to change nsslapd-cachememsize No such object
INFO  tests.tickets.ticket48665_test:ticket48665_test.py:52 5 entries are returned from the server.
INFO  tests.tickets.ticket48665_test:ticket48665_test.py:63 5 entries are returned from the server.
INFO  tests.tickets.ticket48665_test:ticket48665_test.py:65 Test complete
Passed tickets/ticket48745_test.py::test_ticket48745_init 0.61
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48745_test:ticket48745_test.py:26 Initialization: add dummy entries for the tests
Passed tickets/ticket48745_test.py::test_ticket48745_homeDirectory_indexed_cis 2.02
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48745_test:ticket48745_test.py:39 index homeDirectory in caseIgnoreIA5Match and caseExactIA5Match
INFO  tests.tickets.ticket48745_test:ticket48745_test.py:57 successfully checked that filter with exact mr, a filter with lowercase eq is failing
INFO  lib389:tasks.py:798 Index task index_attrs_10202020_213524 completed successfully
INFO  tests.tickets.ticket48745_test:ticket48745_test.py:63 Check indexing succeeded with a specified matching rule
Passed tickets/ticket48745_test.py::test_ticket48745_homeDirectory_mixed_value 0.00
No log output captured.
Passed tickets/ticket48745_test.py::test_ticket48745_extensible_search_after_index 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48745_test:ticket48745_test.py:91 Default: can retrieve an entry filter syntax with exact stored value
INFO  tests.tickets.ticket48745_test:ticket48745_test.py:99 Default: can retrieve an entry filter caseExactIA5Match with exact stored value
INFO  tests.tickets.ticket48745_test:ticket48745_test.py:106 Default: can not retrieve an entry filter syntax match with lowered stored value
INFO  tests.tickets.ticket48745_test:ticket48745_test.py:112 Default: can not retrieve an entry filter caseExactIA5Match with lowered stored value
INFO  tests.tickets.ticket48745_test:ticket48745_test.py:119 Default: can retrieve an entry filter caseIgnoreIA5Match with lowered stored value
Passed tickets/ticket48746_test.py::test_ticket48746_init 0.86
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48746_test:ticket48746_test.py:26 Initialization: add dummy entries for the tests
Passed tickets/ticket48746_test.py::test_ticket48746_homeDirectory_indexed_cis 2.02
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48746_test:ticket48746_test.py:39 index homeDirectory in caseIgnoreIA5Match and caseExactIA5Match
INFO  tests.tickets.ticket48746_test:ticket48746_test.py:57 successfully checked that filter with exact mr, a filter with lowercase eq is failing
INFO  lib389:tasks.py:798 Index task index_attrs_10202020_213534 completed successfully
INFO  tests.tickets.ticket48746_test:ticket48746_test.py:63 Check indexing succeeded with a specified matching rule
Passed tickets/ticket48746_test.py::test_ticket48746_homeDirectory_mixed_value 0.00
No log output captured.
Passed tickets/ticket48746_test.py::test_ticket48746_extensible_search_after_index 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48746_test:ticket48746_test.py:99 Default: can retrieve an entry filter caseExactIA5Match with exact stored value
Passed tickets/ticket48746_test.py::test_ticket48746_homeDirectory_indexed_ces 2.02
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48746_test:ticket48746_test.py:104 index homeDirectory in caseExactIA5Match, this would trigger the crash
INFO  tests.tickets.ticket48746_test:ticket48746_test.py:121 successfully checked that filter with exact mr, a filter with lowercase eq is failing
INFO  lib389:tasks.py:798 Index task index_attrs_10202020_213536 completed successfully
INFO  tests.tickets.ticket48746_test:ticket48746_test.py:127 Check indexing succeeded with a specified matching rule
Passed tickets/ticket48759_test.py::test_ticket48759 21.98
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48759_test.py:66 !!!!!!! uid=member2,dc=example,dc=com: memberof->b'cn=group,dc=example,dc=com'
INFO  lib389:ticket48759_test.py:66 !!!!!!! uid=member2,dc=example,dc=com: memberof->b'cn=group,dc=example,dc=com'
INFO  lib389:ticket48759_test.py:66 !!!!!!! uid=member2,dc=example,dc=com: memberof->b'cn=group,dc=example,dc=com'
Passed tickets/ticket48808_test.py::test_ticket48808 6.35
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:159 Run multiple paging controls on a single connection
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:48 Adding 100 users
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:166 Set user bind
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:169 Create simple paged results control instance
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:174 Iteration 0
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:174 Iteration 1
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:174 Iteration 2
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:193 Set Directory Manager bind back
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:75 Deleting 100 users
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:197 Abandon the search
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:48 Adding 10 users
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:204 Set user bind
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:207 Create simple paged results control instance
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:224 Set Directory Manager bind back
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:75 Deleting 10 users
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:228 Search should fail with 'nsPagedSizeLimit = 5' and 'nsslapd-pagedsizelimit = 15' with 10 users
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:48 Adding 10 users
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:95 Set nsslapd-pagedsizelimit to b'15'. Previous value - [b'0']. Modified suffix - cn=config.
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:95 Set nsPagedSizeLimit to b'5'. Previous value - None. Modified suffix - uid=simplepaged_test,dc=example,dc=com.
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:243 Set user bind
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:246 Create simple paged results control instance
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:250 Expect to fail with SIZELIMIT_EXCEEDED
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:130 Getting page 0
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:255 Set Directory Manager bind back
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:75 Deleting 10 users
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:95 Set nsslapd-pagedsizelimit to [b'0']. Previous value - [b'15']. Modified suffix - cn=config.
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:95 Set nsPagedSizeLimit to None. Previous value - [b'5']. Modified suffix - uid=simplepaged_test,dc=example,dc=com.
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:263 Search should pass with 'nsPagedSizeLimit = 15' and 'nsslapd-pagedsizelimit = 5' with 10 users
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:48 Adding 10 users
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:95 Set nsslapd-pagedsizelimit to b'5'. Previous value - [b'0']. Modified suffix - cn=config.
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:95 Set nsPagedSizeLimit to b'15'. Previous value - None. Modified suffix - uid=simplepaged_test,dc=example,dc=com.
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:277 Set user bind
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:280 Create simple paged results control instance
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:284 Search should PASS
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:130 Getting page 0
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:287 10 results
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:290 Set Directory Manager bind back
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:75 Deleting 10 users
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:95 Set nsslapd-pagedsizelimit to [b'0']. Previous value - [b'5']. Modified suffix - cn=config.
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:95 Set nsPagedSizeLimit to None. Previous value - [b'15']. Modified suffix - uid=simplepaged_test,dc=example,dc=com.
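The 'Getting page N' loop above follows the standard python-ldap simple-paged-results pattern. A sketch with an illustrative URI, bind identity, and page size:

import ldap
from ldap.controls import SimplePagedResultsControl

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('uid=simplepaged_test,dc=example,dc=com', 'password')

req_ctrl = SimplePagedResultsControl(True, size=5, cookie='')
while True:
    msgid = conn.search_ext('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                            '(uid=*)', serverctrls=[req_ctrl])
    rtype, rdata, rmsgid, rctrls = conn.result3(msgid)
    # The server hands back its paging cookie in the response control.
    pctrls = [c for c in rctrls
              if c.controlType == SimplePagedResultsControl.controlType]
    if not pctrls or not pctrls[0].cookie:
        break  # empty cookie: no more pages
    req_ctrl.cookie = pctrls[0].cookie

With nsPagedSizeLimit or nsslapd-pagedsizelimit lower than the candidate set, the same loop raises ldap.SIZELIMIT_EXCEEDED, which is what the failing variants above assert.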
Passed tickets/ticket48844_test.py::test_ticket48844_init 1.75
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:backend.py:80 List backend with suffix=dc=bitwise,dc=com
INFO  lib389:backend.py:290 Creating a local backend
INFO  lib389:backend.py:76 List backend cn=TestBitw,cn=ldbm database,cn=plugins,cn=config
INFO  lib389:__init__.py:1772 Found entry dn: cn=TestBitw,cn=ldbm database,cn=plugins,cn=config cn: TestBitw nsslapd-cachememsize: 512000 nsslapd-cachesize: -1 nsslapd-directory: /var/lib/dirsrv/slapd-standalone1/db/TestBitw nsslapd-dncachememsize: 16777216 nsslapd-readonly: off nsslapd-require-index: off nsslapd-require-internalop-index: off nsslapd-suffix: dc=bitwise,dc=com objectClass: top objectClass: extensibleObject objectClass: nsBackendInstance
INFO  lib389:mappingTree.py:154 Entry dn: cn="dc=bitwise,dc=com",cn=mapping tree,cn=config cn: dc=bitwise,dc=com nsslapd-backend: TestBitw nsslapd-state: backend objectclass: top objectclass: extensibleObject objectclass: nsMappingTree
INFO  lib389:__init__.py:1772 Found entry dn: cn=dc\3Dbitwise\2Cdc\3Dcom,cn=mapping tree,cn=config cn: dc=bitwise,dc=com nsslapd-backend: TestBitw nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree
Passed tickets/ticket48844_test.py::test_ticket48844_bitwise_on 3.42
No log output captured.
Passed tickets/ticket48844_test.py::test_ticket48844_bitwise_off 4.33
No log output captured.
Passed tickets/ticket48891_test.py::test_ticket48891_setup 2.35
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48891_test.py:43 Bind as cn=Directory Manager
INFO  lib389:ticket48891_test.py:52 ######################### SETUP SUFFIX o=ticket48891.org ######################
INFO  lib389:backend.py:80 List backend with suffix=dc=ticket48891.org
INFO  lib389:backend.py:290 Creating a local backend
INFO  lib389:backend.py:76 List backend cn=ticket48891,cn=ldbm database,cn=plugins,cn=config
INFO  lib389:__init__.py:1772 Found entry dn: cn=ticket48891,cn=ldbm database,cn=plugins,cn=config cn: ticket48891 nsslapd-cachememsize: 512000 nsslapd-cachesize: -1 nsslapd-directory: /var/lib/dirsrv/slapd-standalone1/db/ticket48891 nsslapd-dncachememsize: 16777216 nsslapd-readonly: off nsslapd-require-index: off nsslapd-require-internalop-index: off nsslapd-suffix: dc=ticket48891.org objectClass: top objectClass: extensibleObject objectClass: nsBackendInstance
INFO  lib389:mappingTree.py:154 Entry dn: cn="dc=ticket48891.org",cn=mapping tree,cn=config cn: dc=ticket48891.org nsslapd-backend: ticket48891 nsslapd-state: backend objectclass: top objectclass: extensibleObject objectclass: nsMappingTree
INFO  lib389:__init__.py:1772 Found entry dn: cn=dc\3Dticket48891.org,cn=mapping tree,cn=config cn: dc=ticket48891.org nsslapd-backend: ticket48891 nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree
INFO  lib389:ticket48891_test.py:61 ######################### Generate Test data ######################
INFO  lib389:ticket48891_test.py:77 ######################### SEARCH ALL ######################
INFO  lib389:ticket48891_test.py:78 Bind as cn=Directory Manager and add the READ/SEARCH SELFDN aci
INFO  lib389:ticket48891_test.py:82 Returned 10 entries.
INFO  lib389:ticket48891_test.py:86 10 person entries are successfully created under dc=ticket48891.org.
Passed tickets/ticket48893_test.py::test_ticket48893 0.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48893_test:ticket48893_test.py:46 Test PASSED
Passed tickets/ticket48896_test.py::test_ticket48896 1.42
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:69 Testing Ticket 48896 - Default Setting for passwordMinTokenLength does not work
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:71 Setting global password policy with password syntax.
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:80 Default passwordMinTokenLength == b'3'
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:81 Default passwordInHistory == b'6'
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:83 Adding a user.
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:47 Binding as {uid=buser123,dc=example,dc=com, password}
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:51 Replacing password: password -> Abcd012+, which should be ok
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:63 PASSED
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:47 Binding as {uid=buser123,dc=example,dc=com, Abcd012+}
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:51 Replacing password: Abcd012+ -> user, which should fail
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:55 Exception (expected): CONSTRAINT_VIOLATION
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:63 PASSED
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:47 Binding as {uid=buser123,dc=example,dc=com, Abcd012+}
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:51 Replacing password: Abcd012+ -> buser123, which should fail
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:55 Exception (expected): CONSTRAINT_VIOLATION
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:63 PASSED
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:47 Binding as {uid=buser123,dc=example,dc=com, Abcd012+}
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:51 Replacing password: Abcd012+ -> Tuse!1234, which should fail
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:55 Exception (expected): CONSTRAINT_VIOLATION
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:63 PASSED
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:47 Binding as {uid=buser123,dc=example,dc=com, Abcd012+}
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:51 Replacing password: Abcd012+ -> Tuse!0987, which should fail
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:55 Exception (expected): CONSTRAINT_VIOLATION
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:63 PASSED
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:47 Binding as {uid=buser123,dc=example,dc=com, Abcd012+}
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:51 Replacing password: Abcd012+ -> Tabc!1234, which should fail
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:55 Exception (expected): CONSTRAINT_VIOLATION
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:63 PASSED
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:47 Binding as {uid=buser123,dc=example,dc=com, Abcd012+}
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:51 Replacing password: Abcd012+ -> Direc+ory389, which should be ok
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:63 PASSED
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:132 SUCCESS
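The rejected passwords above are all tokens of the user's own entry (the uid, or substrings of it no shorter than passwordMinTokenLength), refused once syntax checking is on. A sketch of the failing case with illustrative credentials:

import ldap

admin = ldap.initialize('ldap://localhost:38901')
admin.simple_bind_s('cn=Directory Manager', 'password')
admin.modify_s('cn=config', [(ldap.MOD_REPLACE, 'passwordCheckSyntax', b'on')])

user = ldap.initialize('ldap://localhost:38901')
user.simple_bind_s('uid=buser123,dc=example,dc=com', 'Abcd012+')
try:
    # 'buser123' is a token of the bind DN, so the syntax check rejects it.
    user.modify_s('uid=buser123,dc=example,dc=com',
                  [(ldap.MOD_REPLACE, 'userPassword', b'buser123')])
except ldap.CONSTRAINT_VIOLATION:
    pass  # expected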
Passed tickets/ticket48906_test.py::test_ticket48906_setup 0.81
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48906_test.py:63 Bind as cn=Directory Manager
INFO  lib389:ticket48906_test.py:83 ######################### SEARCH ALL ######################
INFO  lib389:ticket48906_test.py:84 Bind as cn=Directory Manager and add the READ/SEARCH SELFDN aci
INFO  lib389:ticket48906_test.py:88 Returned 10 entries.
INFO  lib389:ticket48906_test.py:92 10 person entries are successfully created under dc=example,dc=com.
Passed tickets/ticket48906_test.py::test_ticket48906_dblock_default 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48906_test.py:149 ###################################
INFO  lib389:ticket48906_test.py:150 ###
INFO  lib389:ticket48906_test.py:151 ### Check that before any change config/monitor
INFO  lib389:ticket48906_test.py:152 ### contains the default value
INFO  lib389:ticket48906_test.py:153 ###
INFO  lib389:ticket48906_test.py:154 ###################################
Passed tickets/ticket48906_test.py::test_ticket48906_dblock_ldap_update 2.68
------------------------------Captured stdout call------------------------------
line locks:10000 expected_value 10000 value 10000
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48906_test.py:160 ###################################
INFO  lib389:ticket48906_test.py:161 ###
INFO  lib389:ticket48906_test.py:162 ### Check that after ldap update
INFO  lib389:ticket48906_test.py:163 ### - monitor contains DEFAULT
INFO  lib389:ticket48906_test.py:164 ### - configured contains DBLOCK_LDAP_UPDATE
INFO  lib389:ticket48906_test.py:165 ### - After stop dse.ldif contains DBLOCK_LDAP_UPDATE
INFO  lib389:ticket48906_test.py:166 ### - After stop guardian contains DEFAULT
INFO  lib389:ticket48906_test.py:167 ### In fact guardian should differ from config to recreate the env
INFO  lib389:ticket48906_test.py:168 ### Check that after restart (DBenv recreated)
INFO  lib389:ticket48906_test.py:169 ### - monitor contains DBLOCK_LDAP_UPDATE
INFO  lib389:ticket48906_test.py:170 ### - configured contains DBLOCK_LDAP_UPDATE
INFO  lib389:ticket48906_test.py:171 ### - dse.ldif contains DBLOCK_LDAP_UPDATE
INFO  lib389:ticket48906_test.py:172 ###
INFO  lib389:ticket48906_test.py:173 ###################################
Passed tickets/ticket48906_test.py::test_ticket48906_dblock_edit_update 6.10
------------------------------Captured stdout call------------------------------
line locks:20000 expected_value 20000 value 20000
line locks:40000 expected_value 40000 value 40000
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48906_test.py:191 ###################################
INFO  lib389:ticket48906_test.py:192 ###
INFO  lib389:ticket48906_test.py:193 ### Check that after stop
INFO  lib389:ticket48906_test.py:194 ### - dse.ldif contains DBLOCK_LDAP_UPDATE
INFO  lib389:ticket48906_test.py:195 ### - guardian contains DBLOCK_LDAP_UPDATE
INFO  lib389:ticket48906_test.py:196 ### Check that edit dse+restart
INFO  lib389:ticket48906_test.py:197 ### - monitor contains DBLOCK_EDIT_UPDATE
INFO  lib389:ticket48906_test.py:198 ### - configured contains DBLOCK_EDIT_UPDATE
INFO  lib389:ticket48906_test.py:199 ### Check that after stop
INFO  lib389:ticket48906_test.py:200 ### - dse.ldif contains DBLOCK_EDIT_UPDATE
INFO  lib389:ticket48906_test.py:201 ### - guardian contains DBLOCK_EDIT_UPDATE
INFO  lib389:ticket48906_test.py:202 ###
INFO  lib389:ticket48906_test.py:203 ###################################
Passed tickets/ticket48906_test.py::test_ticket48906_dblock_robust 5.68
------------------------------Captured stdout call------------------------------
line locks:40000 expected_value 40000 value 40000
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48906_test.py:245 ###################################
INFO  lib389:ticket48906_test.py:246 ###
INFO  lib389:ticket48906_test.py:247 ### Check that the following values are rejected
INFO  lib389:ticket48906_test.py:248 ### - negative value
INFO  lib389:ticket48906_test.py:249 ### - insufficient value
INFO  lib389:ticket48906_test.py:250 ### - invalid value
INFO  lib389:ticket48906_test.py:251 ### Check that minimum value is accepted
INFO  lib389:ticket48906_test.py:252 ###
INFO  lib389:ticket48906_test.py:253 ###################################
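The places these dblock tests compare are readable over LDAP (plus dse.ldif and the guardian file on disk). A sketch of the two LDAP reads; the monitor attribute name here is an assumption, not taken from this run:

import ldap

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('cn=Directory Manager', 'password')

CFG_DN = 'cn=config,cn=ldbm database,cn=plugins,cn=config'
MON_DN = 'cn=database,cn=monitor,cn=ldbm database,cn=plugins,cn=config'

# Configured value; only takes effect once the DB environment is recreated.
cfg = conn.search_s(CFG_DN, ldap.SCOPE_BASE, '(objectclass=*)',
                    ['nsslapd-db-locks'])
# Value the running environment was built with (attribute name assumed).
mon = conn.search_s(MON_DN, ldap.SCOPE_BASE, '(objectclass=*)',
                    ['nsslapd-db-configured-locks'])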
Passed tickets/ticket48956_test.py::test_ticket48956 39.31
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
Additional information: Invalid attribute in filter - results may not be complete.
Additional information: Invalid attribute in filter - results may not be complete.
Additional information: Invalid attribute in filter - results may not be complete.
Additional information: Invalid attribute in filter - results may not be complete.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48956_test:ticket48956_test.py:54 ######################### Adding Account Policy entry: cn=Account Inactivation Policy,dc=example,dc=com ######################
INFO  tests.tickets.ticket48956_test:ticket48956_test.py:61 ######################### Adding Test User entry: uid=ticket48956user,dc=example,dc=com ######################
INFO  tests.tickets.ticket48956_test:ticket48956_test.py:45 output: b'uid=ticket48956user,dc=example,dc=com - activated.\n'
INFO  tests.tickets.ticket48956_test:ticket48956_test.py:45 output: b'uid=ticket48956user,dc=example,dc=com - inactivated (inactivity limit exceeded).\n'
INFO  lib389:backend.py:80 List backend with suffix=dc=this,dc=is,dc=a,dc=very,dc=long,dc=suffix,dc=so,dc=long,dc=suffix,dc=extremely,dc=long,dc=suffix
INFO  lib389:backend.py:290 Creating a local backend
INFO  lib389:backend.py:76 List backend cn=ticket48956,cn=ldbm database,cn=plugins,cn=config
INFO  lib389:__init__.py:1772 Found entry dn: cn=ticket48956,cn=ldbm database,cn=plugins,cn=config cn: ticket48956 nsslapd-cachememsize: 512000 nsslapd-cachesize: -1 nsslapd-directory: /var/lib/dirsrv/slapd-standalone1/db/ticket48956 nsslapd-dncachememsize: 16777216 nsslapd-readonly: off nsslapd-require-index: off nsslapd-require-internalop-index: off nsslapd-suffix: dc=this,dc=is,dc=a,dc=very,dc=long,dc=suffix,dc=so,dc=long,dc=suffix,dc=extremely,dc=long,dc=suffix objectClass: top objectClass: extensibleObject objectClass: nsBackendInstance
INFO  lib389:mappingTree.py:154 Entry dn: cn="dc=this,dc=is,dc=a,dc=very,dc=long,dc=suffix,dc=so,dc=long,dc=suffix,dc=extremely,dc=long,dc=suffix",cn=mapping tree,cn=config cn: dc=this,dc=is,dc=a,dc=very,dc=long,dc=suffix,dc=so,dc=long,dc=suffix,dc=extremely,dc=long,dc=suffix nsslapd-backend: ticket48956 nsslapd-state: backend objectclass: top objectclass: extensibleObject objectclass: nsMappingTree
INFO  lib389:__init__.py:1772 Found entry dn: cn=dc\3Dthis\2Cdc\3Dis\2Cdc\3Da\2Cdc\3Dvery\2Cdc\3Dlong\2Cdc\3Dsuffix\2Cdc\3Dso\2Cdc\3Dlong\2Cdc\3Dsuffix\2Cdc\3Dextremely\2Cdc\3Dlong\2Cdc\3Dsuffix,cn=mapping tree,cn=config cn: dc=this,dc=is,dc=a,dc=very,dc=long,dc=suffix,dc=so,dc=long,dc=suffix,dc=extremely,dc=long,dc=suffix nsslapd-backend: ticket48956 nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree
INFO  tests.tickets.ticket48956_test:ticket48956_test.py:54 ######################### Adding Account Policy entry: cn=Account Inactivation Policy,dc=this,dc=is,dc=a,dc=very,dc=long,dc=suffix,dc=so,dc=long,dc=suffix,dc=extremely,dc=long,dc=suffix ######################
INFO  tests.tickets.ticket48956_test:ticket48956_test.py:61 ######################### Adding Test User entry: uid=ticket48956user,dc=this,dc=is,dc=a,dc=very,dc=long,dc=suffix,dc=so,dc=long,dc=suffix,dc=extremely,dc=long,dc=suffix ######################
INFO  tests.tickets.ticket48956_test:ticket48956_test.py:45 output: b'uid=ticket48956user,dc=this,dc=is,dc=a,dc=very,dc=long,dc=suffix,dc=so,dc=long,dc=suffix,dc=extremely,dc=long,dc=suffix - activated.\n'
INFO  tests.tickets.ticket48956_test:ticket48956_test.py:45 output: b'uid=ticket48956user,dc=this,dc=is,dc=a,dc=very,dc=long,dc=suffix,dc=so,dc=long,dc=suffix,dc=extremely,dc=long,dc=suffix - inactivated (inactivity limit exceeded).\n'
INFO  tests.tickets.ticket48956_test:ticket48956_test.py:121 Test PASSED
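The activated/inactivated transitions above come from the Account Policy plugin: the user carries an acctPolicySubentry pointing at a policy entry whose accountInactivityLimit (in seconds) it then exceeds. A minimal sketch of those two pieces, with illustrative DNs and limit:

import ldap
import ldap.modlist

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('cn=Directory Manager', 'password')

policy = {
    'objectClass': [b'top', b'ldapsubentry', b'extensibleObject', b'accountpolicy'],
    'accountInactivityLimit': [b'30'],  # seconds of allowed inactivity
}
conn.add_s('cn=Account Inactivation Policy,dc=example,dc=com',
           ldap.modlist.addModlist(policy))

# Tie a user to the policy.
conn.modify_s('uid=ticket48956user,dc=example,dc=com',
              [(ldap.MOD_REPLACE, 'acctPolicySubentry',
                b'cn=Account Inactivation Policy,dc=example,dc=com')])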
Passed tickets/ticket48973_test.py::test_ticket48973_init 0.59
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:191 Initialization: add dummy entries for the tests
Passed tickets/ticket48973_test.py::test_ticket48973_ces_not_indexed 2.56
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:211 Search (homeDirectory=/home/xyz_0)
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:218 result=[20/Oct/2020:21:41:44.792099230 -0400] conn=2 op=102 RESULT err=0 tag=101 nentries=1 wtime=0.000158816 optime=0.001136945 etime=0.001291801 notes=U details="Partially Unindexed Filter
Passed tickets/ticket48973_test.py::test_ticket48973_homeDirectory_caseExactIA5Match_caseIgnoreIA5Match_indexing 24.44
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:273 index homeDirectory in caseExactIA5Match and caseIgnoreIA5Match
INFO  lib389:tasks.py:798 Index task index_attrs_10202020_214156 completed successfully
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:284 Check indexing succeeded with no specified matching rule
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:173 Search (homeDirectory=/home/xyz_4)
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:184 result=[20/Oct/2020:21:41:58.078548290 -0400] conn=1 op=9 RESULT err=0 tag=101 nentries=1 wtime=0.000202505 optime=0.000396643 etime=0.000596059
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:173 Search (homeDirectory:caseExactIA5Match:=/home/xyz_4)
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:184 result=[20/Oct/2020:21:42:01.358485372 -0400] conn=1 op=2 RESULT err=0 tag=101 nentries=1 wtime=0.000193663 optime=0.000652754 etime=0.000843409
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:173 Search (homeDirectory:caseIgnoreIA5Match:=/home/xyz_4)
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:184 result=[20/Oct/2020:21:42:05.077580320 -0400] conn=1 op=2 RESULT err=0 tag=101 nentries=1 wtime=0.000262898 optime=0.000649960 etime=0.000910111
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:173 Search (homeDirectory=/HOME/XYZ_4)
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:184 result=[20/Oct/2020:21:42:08.805804799 -0400] conn=1 op=2 RESULT err=0 tag=101 nentries=0 wtime=0.000246233 optime=0.000449373 etime=0.000691966
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:173 Search (homeDirectory:caseExactIA5Match:=/HOME/XYZ_4)
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:184 result=[20/Oct/2020:21:42:12.529384189 -0400] conn=1 op=2 RESULT err=0 tag=101 nentries=0 wtime=0.000233753 optime=0.000366694 etime=0.000597225
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:173 Search (homeDirectory:caseIgnoreIA5Match:=/HOME/XYZ_4)
INFO  tests.tickets.ticket48973_test:ticket48973_test.py:184 result=[20/Oct/2020:21:42:16.146232055 -0400] conn=1 op=2 RESULT err=0 tag=101 nentries=1 wtime=0.000233867 optime=0.000504384 etime=0.000735333
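The result= lines above are RESULT records from the instance access log; an unindexed search is recognizable by its notes= flag ('A' fully unindexed, 'U' partially unindexed). A sketch of the scan, with the log path following the standalone1 naming used throughout this report:

# Collect access-log RESULT lines that the server flagged as unindexed.
with open('/var/log/dirsrv/slapd-standalone1/access') as logfile:
    flagged = [line.rstrip() for line in logfile
               if 'RESULT' in line and ('notes=A' in line or 'notes=U' in line)]
for line in flagged:
    print(line)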
Passed tickets/ticket49039_test.py::test_ticket49039 17.25
-----------------------------Captured stdout setup------------------------------
Instance slapd-standalone1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49039_test:ticket49039_test.py:112 Test Passed
Passed tickets/ticket49072_test.py::test_ticket49072_basedn 4.92
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49072_test:ticket49072_test.py:40 Ticket 49072 memberof fixup task with invalid basedn...
INFO  tests.tickets.ticket49072_test:ticket49072_test.py:59 output: b'Successfully added task entry "cn=memberOf_fixup_2020_10_20_21_44_31, cn=memberOf task, cn=tasks, cn=config"\n'
INFO  tests.tickets.ticket49072_test:ticket49072_test.py:63 Error log out: ['[20/Oct/2020:21:44:31.122898626 -0400] - ERR - memberof-plugin - memberof_fixup_task_thread - Failed to get be backend from (dc=testdb,dc=com)\n']
Passed tickets/ticket49072_test.py::test_ticket49072_filter 20.29
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49072_test:ticket49072_test.py:77 Ticket 49072 memberof fixup task with invalid filter...
INFO  tests.tickets.ticket49072_test:ticket49072_test.py:78 Wait for 10 secs and check if task is completed
INFO  tests.tickets.ticket49072_test:ticket49072_test.py:82 memberof task is still running, wait for +10 secs
INFO  tests.tickets.ticket49072_test:ticket49072_test.py:100 output: b'Successfully added task entry "cn=memberOf_fixup_2020_10_20_21_44_51, cn=memberOf task, cn=tasks, cn=config"\n'
INFO  tests.tickets.ticket49072_test:ticket49072_test.py:104 Error log out: ['[20/Oct/2020:21:44:51.539411631 -0400] - ERR - memberof-plugin - memberof_fix_memberof - Failed (Bad search filter)\n']
INFO  tests.tickets.ticket49072_test:ticket49072_test.py:107 Ticket 49072 complete: memberOf fixup task does not validate args
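Both tests drive the fixup by adding an entry under cn=memberOf task,cn=tasks,cn=config; the plugin reads its basedn and filter attributes, which is where the invalid values are injected. A sketch of a well-formed task add, with illustrative connection details:

import time
import ldap
import ldap.modlist

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('cn=Directory Manager', 'password')

task_dn = ('cn=memberOf_fixup_%d,cn=memberOf task,cn=tasks,cn=config'
           % int(time.time()))
task = {
    'objectClass': [b'top', b'extensibleObject'],
    'basedn': [b'dc=example,dc=com'],     # the tests inject a bad basedn here
    'filter': [b'(objectclass=inetuser)'],  # ... or a bad filter here
}
conn.add_s(task_dn, ldap.modlist.addModlist(task))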
Passed tickets/ticket49076_test.py::test_ticket49076 9.70
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket49095_test.py::test_ticket49095 1.08
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49095_test:ticket49095_test.py:79 Test Passed
Passed tickets/ticket49104_test.py::test_ticket49104_setup 35.91
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:498 Import task import_10202020_214605 for file /var/lib/dirsrv/slapd-standalone1/ldif/49104.ldif completed successfully
Passed tickets/ticket49104_test.py::test_ticket49104 0.22
------------------------------Captured stderr call------------------------------
valgrind: /usr/bin/dbscan-bin: No such file or directory
grep: /tmp/val49104.out: No such file or directory
valgrind: /usr/bin/dbscan-bin: No such file or directory
grep: /tmp/val49104.out: No such file or directory
valgrind: /usr/bin/dbscan-bin: No such file or directory
grep: /tmp/val49104.out: No such file or directory
valgrind: /usr/bin/dbscan-bin: No such file or directory
grep: /tmp/val49104.out: No such file or directory
valgrind: /usr/bin/dbscan-bin: No such file or directory
grep: /tmp/val49104.out: No such file or directory
valgrind: /usr/bin/dbscan-bin: No such file or directory
grep: /tmp/val49104.out: No such file or directory
valgrind: /usr/bin/dbscan-bin: No such file or directory
grep: /tmp/val49104.out: No such file or directory
valgrind: /usr/bin/dbscan-bin: No such file or directory
grep: /tmp/val49104.out: No such file or directory
valgrind: /usr/bin/dbscan-bin: No such file or directory
grep: /tmp/val49104.out: No such file or directory
valgrind: /usr/bin/dbscan-bin: No such file or directory
grep: /tmp/val49104.out: No such file or directory
Passed tickets/ticket49122_test.py::test_ticket49122 13.74
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49122_test:ticket49122_test.py:57 Testing filter: nsrole=cn=empty,dc=example,dc=com
INFO  tests.tickets.ticket49122_test:ticket49122_test.py:57 Testing filter: (nsrole=cn=empty,dc=example,dc=com)
INFO  tests.tickets.ticket49122_test:ticket49122_test.py:57 Testing filter: (&(nsrole=cn=empty,dc=example,dc=com))
INFO  tests.tickets.ticket49122_test:ticket49122_test.py:57 Testing filter: (!(nsrole=cn=empty,dc=example,dc=com))
INFO  tests.tickets.ticket49122_test:ticket49122_test.py:57 Testing filter: (&(|(objectclass=person)(sn=app*))(userpassword=*))
INFO  tests.tickets.ticket49122_test:ticket49122_test.py:57 Testing filter: (&(|(objectclass=person)(nsrole=cn=empty,dc=example,dc=com))(userpassword=*))
INFO  tests.tickets.ticket49122_test:ticket49122_test.py:57 Testing filter: (&(|(nsrole=cn=empty,dc=example,dc=com)(sn=app*))(userpassword=*))
INFO  tests.tickets.ticket49122_test:ticket49122_test.py:57 Testing filter: (&(|(objectclass=person)(sn=app*))(nsrole=cn=empty,dc=example,dc=com))
INFO  tests.tickets.ticket49122_test:ticket49122_test.py:57 Testing filter: (&(|(&(cn=*)(objectclass=person)(nsrole=cn=empty,dc=example,dc=com)))(uid=*))
INFO  lib389:ticket49122_test.py:86 Test Passed
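nsrole is a virtual attribute computed by the Roles plugin, so it cannot be indexed; the point of the test is that every filter shape above is answered without an error or a crash. A sketch of the same loop with a trimmed filter list and illustrative credentials:

import ldap

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('cn=Directory Manager', 'password')

filters = [
    '(nsrole=cn=empty,dc=example,dc=com)',
    '(!(nsrole=cn=empty,dc=example,dc=com))',
    '(&(|(objectclass=person)(nsrole=cn=empty,dc=example,dc=com))(userpassword=*))',
]
for f in filters:
    # Must complete cleanly even though nsrole is computed, not stored.
    conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE, f, ['cn'])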
Passed tickets/ticket49184_test.py::test_ticket49184 5.42
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49184_test:ticket49184_test.py:89 create users and group...
INFO  tests.tickets.ticket49184_test:ticket49184_test.py:39 Adding members to the group...
INFO  tests.tickets.ticket49184_test:ticket49184_test.py:39 Adding members to the group...
Passed tickets/ticket49227_test.py::test_ticket49227 20.75
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket49249_test.py::test_ticket49249 0.07
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket49273_test.py::test_49273_corrupt_dbversion 3.47
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket49290_test.py::test_49290_range_unindexed_notes 4.43
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
[20/Oct/2020:21:49:47.838763375 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000
[20/Oct/2020:21:49:47.848709590 -0400] - INFO - bdb_instance_start - Import is running with nsslapd-db-private-import-mem on; No other process is allowed to access the database
[20/Oct/2020:21:49:47.852252891 -0400] - INFO - check_and_set_import_cache - pagesize: 4096, available bytes 7402979328, process usage 22994944
[20/Oct/2020:21:49:47.855193174 -0400] - INFO - check_and_set_import_cache - Import allocates 2891788KB import cache.
[20/Oct/2020:21:49:48.033484209 -0400] - INFO - bdb_db2index - userRoot: Indexing attribute: modifytimestamp
[20/Oct/2020:21:49:48.043331710 -0400] - ERR - libdb - BDB1566 txn_checkpoint interface requires an environment configured for the transaction subsystem
[20/Oct/2020:21:49:48.046403272 -0400] - ERR - dblayer_force_checkpoint - Checkpoint FAILED, error Invalid argument (22)
[20/Oct/2020:21:49:48.053458413 -0400] - INFO - bdb_db2index - userRoot: Finished indexing.
[20/Oct/2020:21:49:48.075697484 -0400] - INFO - bdb_pre_close - All database threads now stopped
Passed tickets/ticket49386_test.py::test_ticket49386 32.70
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_0,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_1,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_2,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_3,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_4,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_5,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_6,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_7,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_8,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_9,ou=people,dc=example,dc=com):
INFO  lib389:ticket49386_test.py:65 !!!!!!! cn=user_1,ou=people,dc=example,dc=com: memberof->b'cn=group_1,ou=Groups,dc=example,dc=com'
INFO  lib389:ticket49386_test.py:66 !!!!!!! b'cn=group_1,ou=Groups,dc=example,dc=com'
INFO  lib389:ticket49386_test.py:67 !!!!!!! cn=group_1,ou=groups,dc=example,dc=com
INFO  tests.tickets.ticket49386_test:ticket49386_test.py:130 memberof log found: [20/Oct/2020:21:50:50.302497747 -0400] - DEBUG - memberof-plugin - memberof_postop_modrdn: Skip modrdn operation because src/dst identical cn=group_1,ou=Groups,dc=example,dc=com
Passed tickets/ticket49441_test.py::test_ticket49441 8.30
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49441_test:ticket49441_test.py:35 Position ldif files, and add indexes...
INFO  tests.tickets.ticket49441_test:ticket49441_test.py:52 Import LDIF with large indexed binary attributes...
ERROR  lib389:tasks.py:495 Error: import task import_10202020_215142 for file /var/lib/dirsrv/slapd-standalone1/ldifbinary.ldif exited with -23
INFO  tests.tickets.ticket49441_test:ticket49441_test.py:61 Verify server is still running...
INFO  tests.tickets.ticket49441_test:ticket49441_test.py:68 Test PASSED
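The import goes through the same lib389 task helper whose tasks.py:495 error line appears above; the ticket's point is that a failing ldif2db task (exit -23 on the oversized binary index values) must not take the server down. A sketch, assuming a started DirSrv instance named standalone (the variable name is an assumption):

from lib389.properties import TASK_WAIT

# Schedule an ldif2db import task and wait for its exit code.
standalone.tasks.importLDIF(
    suffix='dc=example,dc=com',
    input_file='/var/lib/dirsrv/slapd-standalone1/ldifbinary.ldif',
    args={TASK_WAIT: True})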
Passed tickets/ticket49471_test.py::test_ticket49471 2.03
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
CRITICAL tests.tickets.ticket49471_test:ticket49471_test.py:28 Adding user (cn=user_1,ou=people,dc=example,dc=com):
Passed tickets/ticket49540_test.py::test_ticket49540 18.98
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:28 index homeDirectory
INFO  lib389:tasks.py:798 Index task index_attrs_10202020_215354 completed successfully
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:62 check_task_status =========> 0 th loop
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:71 cn=index_attrs_10202020_215354,cn=index,cn=tasks,cn=config ---> NO STATUS
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:83 =========> Great it was expected in the middle of index
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:62 check_task_status =========> 1 th loop
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:71 cn=index_attrs_10202020_215354,cn=index,cn=tasks,cn=config ---> NO STATUS
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:83 =========> Great it was expected in the middle of index
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:62 check_task_status =========> 2 th loop
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:71 cn=index_attrs_10202020_215354,cn=index,cn=tasks,cn=config ---> NO STATUS
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:83 =========> Great it was expected in the middle of index
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:62 check_task_status =========> 3 th loop
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:71 cn=index_attrs_10202020_215354,cn=index,cn=tasks,cn=config ---> NO STATUS
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:83 =========> Great it was expected in the middle of index
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:62 check_task_status =========> 4 th loop
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:71 cn=index_attrs_10202020_215354,cn=index,cn=tasks,cn=config ---> NO STATUS
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:83 =========> Great it was expected in the middle of index
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:62 check_task_status =========> 5 th loop
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:71 cn=index_attrs_10202020_215354,cn=index,cn=tasks,cn=config ---> NO STATUS
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:83 =========> Great it was expected in the middle of index
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:62 check_task_status =========> 6 th loop
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:68 cn=index_attrs_10202020_215354,cn=index,cn=tasks,cn=config ---> b'userRoot: Finished indexing.'
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:62 check_task_status =========> 7 th loop
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:68 cn=index_attrs_10202020_215354,cn=index,cn=tasks,cn=config ---> b'userRoot: Finished indexing.'
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:62 check_task_status =========> 8 th loop
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:68 cn=index_attrs_10202020_215354,cn=index,cn=tasks,cn=config ---> b'userRoot: Finished indexing.'
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:62 check_task_status =========> 9 th loop
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:68 cn=index_attrs_10202020_215354,cn=index,cn=tasks,cn=config ---> b'userRoot: Finished indexing.'
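The NO STATUS / Finished indexing alternation above is the task entry's nsTaskStatus attribute being polled while the task runs. A sketch of the polling loop, with an illustrative URI and bind:

import time
import ldap

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('cn=Directory Manager', 'password')

task_dn = 'cn=index_attrs_10202020_215354,cn=index,cn=tasks,cn=config'
for attempt in range(10):
    entry = conn.search_s(task_dn, ldap.SCOPE_BASE)[0]
    status = entry[1].get('nsTaskStatus')
    if status:
        print(attempt, status[0])    # e.g. b'userRoot: Finished indexing.'
    else:
        print(attempt, 'NO STATUS')  # expected while the task is mid-flight
    time.sleep(1)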
Passed tickets/ticket49623_2_test.py::test_modrdn_loop 0.07
-----------------------------Captured stdout setup------------------------------
Instance slapd-master1 removed.
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:148 Creating replication topology.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49623_2_test:ticket49623_2_test.py:64 Check the log messages for cenotaph error
Passed tickets/ticket49788_test.py::test_ticket49781 0.98
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket50232_test.py::test_ticket50232_normal 10.14
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
ldiffile: /tmp/output_file
[20/Oct/2020:21:56:13.174900096 -0400] - INFO - slapd_exemode_ldif2db - Backend Instance: normal
-------------------------------Captured log call--------------------------------
INFO  lib389:backend.py:80 List backend with suffix=o=normal
INFO  lib389:backend.py:290 Creating a local backend
INFO  lib389:backend.py:76 List backend cn=normal,cn=ldbm database,cn=plugins,cn=config
INFO  lib389:__init__.py:1772 Found entry dn: cn=normal,cn=ldbm database,cn=plugins,cn=config cn: normal nsslapd-cachememsize: 512000 nsslapd-cachesize: -1 nsslapd-directory: /var/lib/dirsrv/slapd-standalone1/db/normal nsslapd-dncachememsize: 16777216 nsslapd-readonly: off nsslapd-require-index: off nsslapd-require-internalop-index: off nsslapd-suffix: o=normal objectClass: top objectClass: extensibleObject objectClass: nsBackendInstance
INFO  lib389:mappingTree.py:154 Entry dn: cn="o=normal",cn=mapping tree,cn=config cn: o=normal nsslapd-backend: normal nsslapd-state: backend objectclass: top objectclass: extensibleObject objectclass: nsMappingTree
INFO  lib389:__init__.py:1772 Found entry dn: cn=o\3Dnormal,cn=mapping tree,cn=config cn: o=normal nsslapd-backend: normal nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree
Passed tickets/ticket50232_test.py::test_ticket50232_reverse 10.25
------------------------------Captured stderr call------------------------------
ldiffile: /tmp/output_file
[20/Oct/2020:21:56:23.750266256 -0400] - INFO - slapd_exemode_ldif2db - Backend Instance: reverse
-------------------------------Captured log call--------------------------------
INFO  lib389:backend.py:80 List backend with suffix=o=reverse
INFO  lib389:backend.py:290 Creating a local backend
INFO  lib389:backend.py:76 List backend cn=reverse,cn=ldbm database,cn=plugins,cn=config
INFO  lib389:__init__.py:1772 Found entry dn: cn=reverse,cn=ldbm database,cn=plugins,cn=config cn: reverse nsslapd-cachememsize: 512000 nsslapd-cachesize: -1 nsslapd-directory: /var/lib/dirsrv/slapd-standalone1/db/reverse nsslapd-dncachememsize: 16777216 nsslapd-readonly: off nsslapd-require-index: off nsslapd-require-internalop-index: off nsslapd-suffix: o=reverse objectClass: top objectClass: extensibleObject objectClass: nsBackendInstance
INFO  lib389:mappingTree.py:154 Entry dn: cn="o=reverse",cn=mapping tree,cn=config cn: o=reverse nsslapd-backend: reverse nsslapd-state: backend objectclass: top objectclass: extensibleObject objectclass: nsMappingTree
INFO  lib389:__init__.py:1772 Found entry dn: cn=o\3Dreverse,cn=mapping tree,cn=config cn: o=reverse nsslapd-backend: reverse nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree
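Both backends in this test are created through lib389's Backends collection, which produces the backend entry and mapping-tree entry logged above. A sketch, assuming a started DirSrv instance named standalone (the variable name is an assumption):

from lib389.backend import Backends

# Creates cn=reverse,cn=ldbm database,cn=plugins,cn=config plus its
# cn="o=reverse",cn=mapping tree,cn=config entry.
be = Backends(standalone).create(properties={
    'cn': 'reverse',
    'nsslapd-suffix': 'o=reverse',
})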
Passed tickets/ticket50234_test.py::test_ticket50234 1.35
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket548_test.py::test_ticket548_test_with_no_policy 0.09
-------------------------------Captured log setup-------------------------------
INFO  lib389.topologies:topologies.py:115 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket548_test.py::test_ticket548_test_global_policy 1.66
No log output captured.
Passed tickets/ticket548_test.py::test_ticket548_test_subtree_policy 3.32
No log output captured.