* Makefile.am: Make sure all sources and data files are
distributed.
* tester/run (die): Do not emit a 'permission denied' message if
die was called too early (while the result dir does not exist
yet).
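A defensive die() along these lines is one way to avoid the spurious error. This is a sketch, not the actual tester/run code; the DTF_RESULTDIR default and log filename are illustrative:

```shell
# Sketch of a die() that tolerates being called before the result
# directory exists (names are illustrative, not the real tester/run code).
DTF_RESULTDIR=${DTF_RESULTDIR:-/tmp/dtf-results}

die() {
    echo "fatal: $*" >&2
    # Only log into the result dir when it already exists; otherwise
    # writing there would itself fail with 'permission denied' or
    # 'no such file or directory'.
    if [ -d "$DTF_RESULTDIR" ]; then
        echo "fatal: $*" >> "$DTF_RESULTDIR/run.log"
    fi
    exit 1
}
```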
* bin/dtf-get-machine.in: Exit if an option was parsed but not
handled explicitly by the case statement.
* bin/dtf-run-remote.in: Likewise.
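The pattern can be sketched as follows (the option names and surrounding loop are illustrative, not the real dtf-get-machine code):

```shell
# Sketch of an option loop that aborts on options the case statement
# does not handle explicitly, instead of silently ignoring them.
parse_opts() {
    while [ $# -gt 0 ]; do
        case $1 in
            --name) opt_name=$2; shift 2 ;;
            --*)
                # Parsed by the generic machinery but not handled here:
                # fail loudly rather than continue with wrong behavior.
                echo "unhandled option: $1" >&2
                return 1
                ;;
            *) break ;;
        esac
    done
}
```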
New script 'dtf-return-machine' returns the VM to OpenStack based
on its public IP. In the future this may be abstracted to any VM
provider (or VM pool), but that also requires some IP <=> VM
mapping shared between dtf-get-machine and dtf-return-machine.
* controller/.gitignore: Ignore new scripts.
* controller/Makefile.am: Build new scripts.
* controller/bin/dtf-return-machine.in: New script for VM
deletion.
* controller/libexec/dtf-nova.in: New wrapper around the 'nova'
command, showing only the data output, with fields separated by
tabs.
* controller/share/dtf-controller/ansible/playbooks/fedora.yml:
Finally call dtf-return-machine after a successful test run.
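The table-to-tabs filtering such a wrapper could apply to 'nova'-style ASCII tables might look like this — an assumption about the approach, not the actual dtf-nova.in code:

```shell
# Sketch: drop the +---+ border lines and the header row of a
# 'nova'-style table, and print the remaining '| a | b |' rows as
# tab-separated fields.
nova_table_to_tsv() {
    awk -F'|' 'NF > 2 {
        line = ""
        for (i = 2; i < NF; i++) {
            f = $i
            gsub(/^ +| +$/, "", f)              # trim surrounding spaces
            line = line (i > 2 ? "\t" : "") f
        }
        print line
    }' | tail -n +2                             # drop the header row
}
```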
* controller/bin/dtf-controller.in: Print '\n' after error msg.
While polling the sshd server on the remote host, do not use
PasswordAuthentication even if the server allows it. That causes
problems when the new VM has already started but cloud-init has
not yet been able to set up the authorized_keys file.
* controller/libexec/dtf-wait-for-ssh: Add the
PasswordAuthentication=no ssh option.
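A sketch of the idea — PasswordAuthentication=no, StrictHostKeyChecking=no and ConnectTimeout are real ssh options, while the polling loop and function names are illustrative:

```shell
# Build the ssh probe command: with password authentication disabled,
# a VM whose cloud-init has not yet installed authorized_keys fails
# fast with a key error instead of hanging on a password prompt.
build_ssh_cmd() {
    # $1 = target host
    echo "ssh -o PasswordAuthentication=no -o StrictHostKeyChecking=no" \
         "-o ConnectTimeout=5 $1 true"
}

wait_for_ssh() {
    host=$1; tries=${2:-60}
    while [ "$tries" -gt 0 ]; do
        if $(build_ssh_cmd "$host") 2>/dev/null; then
            return 0
        fi
        tries=$((tries - 1))
        sleep 5
    done
    return 1
}
```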
* controller/libexec/dtf-wait-for-ssh: Syntax lint.
* controller/bin/dtf-get-machine.in: Do not try to check for the
IP address if the 'nova boot' command failed.
* controller/controller: Removed.
* controller/bin/dtf-controller.in (child_task): Do not exit if
dtf-run-remote failed. This allows us to commit at least the log
files.
* controller/libexec/dtf-commit-results.in: Do not try to extract
the dtf.tar.gz archive if it does not exist (i.e. dtf-run-remote
failed).
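The guard can be sketched like this (the paths and function name are illustrative, not the actual dtf-commit-results.in code):

```shell
# Sketch: commit whatever is present, but only unpack the results
# archive when dtf-run-remote actually produced it.
commit_results() {
    workdir=$1
    if [ -f "$workdir/dtf.tar.gz" ]; then
        tar -xzf "$workdir/dtf.tar.gz" -C "$workdir"
    else
        echo "dtf.tar.gz missing (dtf-run-remote failed?); committing logs only" >&2
    fi
    # ... commit the contents of "$workdir" to the database here ...
}
```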
* share/dtf-controller/results-stats-templates/html.tmpl: Fields
of the result table are now hyperlinked to the particular results.
* controller/bin/dtf-controller.in: Really generate HTML instead
of XML.
* controller/bin/dtf-controller.in (subcommand): Generate the
stdout and stderr files separately.
(child_task): Generate '*.err' and '*.out' logs for subcommands.
Call dtf-run-remote with the --distro/--distro-version options.
Finally call dtf-result-stats and save its output to results.html.
(main): Simple debugging info and comment adjustments.
* controller/libexec/dtf-commit-results.in: Take three arguments
now.
* controller/libexec/dtf-result-stats.in: Read the 'tester/run'
output more robustly.
* controller/share/dtf-controller/ansible/playbooks/fedora.yml:
Run 'run --force' instead of 'run' on the remote host.
* controller/share/dtf-controller/results-stats-templates/html.tmpl:
React to exit_status 2.
* tester/run: Define the 0, 1 and 2 exit statuses. Also handle
the SKIP test-case exit status.
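A sketch of such a convention; the DTF_RESULT_* names follow the naming used elsewhere in this log, but the exact values (SKIP as 2, matching the exit_status 2 handling in the html template) are an assumption:

```shell
# Sketch of a three-valued exit-status convention plus the mapping
# to printable strings; values are an assumption, not the real API.
DTF_RESULT_PASS=0
DTF_RESULT_FAIL=1
DTF_RESULT_SKIP=2

result_string() {
    case $1 in
        "$DTF_RESULT_PASS") echo "[ PASS ]" ;;
        "$DTF_RESULT_SKIP") echo "[ SKIP ]" ;;
        *)                  echo "[ FAIL ]" ;;  # default: anything else fails
    esac
}
```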
.. rather than by the ansible nova_compute module directly. This
allows me to implement more variability in VM handling.
* controller/bin/dtf-get-machine.in: Add a --quiet option which
causes only the allocated IP to be shown. Also add the
DTF_GET_MACHINE_FAKE_IP variable, usable for faster debugging;
when set, dtf-get-machine prints its content to standard output
without allocating a new VM.
* controller/bin/dtf-run-remote.in: Add the -v (verbose) option to
the ansible-playbook call to get more verbose output.
* controller/share/dtf-controller/ansible/playbooks/fedora.yml:
Use dtf-get-machine. Also remove the creds file requirement.
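The debugging short-circuit can be sketched as follows (the variable name comes from the log; the surrounding function is illustrative):

```shell
# Sketch: when DTF_GET_MACHINE_FAKE_IP is set, print it and skip the
# VM allocation entirely -- useful for fast local debugging.
get_machine() {
    if [ -n "$DTF_GET_MACHINE_FAKE_IP" ]; then
        echo "$DTF_GET_MACHINE_FAKE_IP"
        return 0
    fi
    # ... real path: 'nova boot', wait for the IP, print it ...
    echo "allocating a real VM is out of scope for this sketch" >&2
    return 1
}
```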
The controller is able to read a simple YAML configuration file
with a list of tasks to be performed in parallel (each task runs
the testsuite remotely, commits the results to the DB, counts
statistics and uploads the results).
* controller/bin/dtf-controller.in: New template for the binary.
* controller/libexec/dtf-commit-results.in: Copy the whole result
directory instead of the 'dtf' subdir only.
* controller/.gitignore: Ignore the new binary.
* controller/Makefile.am: Build dtf-commit-results.
* tester/run: Handle the "default" return value of 'run()' by
printing the '[ FAIL ]' string to stdout.
* postgresql-tests/config.sh: Use an office-accessible IP.
Define the test return values and their handling. Do not split
the testcase into a configuration and a running script; rather use
one file and wrap the script in a run() method. This is still very
easy to run without running the whole testsuite.
* tester/run (DTF_RESULT_*): Return values API.
(run): Rename to run_test. Make the function more readable; don't
generate xml results (not yet used anyway).
* postgresql-tests/config.sh: Do not source the per-testsuite
library directly, as the configuration script config.sh is sourced
even by the 'run' script itself for the --dist option (for that
action we actually do not need the per-testsuite libraries).
* postgresql-tests/tasks/initdb/runtest.sh: New API used.
* postgresql-tests/tasks/initdb_old/runtest.sh: Likewise.
* postgresql-tests/tasks/upgrade-basic/runtest.sh: Likewise.
* postgresql-tests/tasks/upgrade-utf8-syntax/runtest.sh: Likewise.
* postgresql-tests/tasks/upgrade-basic/config.sh: Remove.
* postgresql-tests/tasks/initdb/config.sh: Remove.
* postgresql-tests/tasks/initdb_old/config.sh: Remove.
* postgresql-tests/tasks/upgrade-utf8-syntax/config.sh: Remove.
I had badly configured git so I missed those before.
* controller/.gitignore: Add autoconf/automake related ignores.
* controller/bin/dtf-get-machine.in: Use $HOME/.dtf/.. rather than
$srcdir/config/...
The directory structure in this project is arranged so that you
can run directly from the source dir (after ./build).
* controller/build: Use --prefix="$(pwd)" instead of --with-git,
which was never implemented.
Move the 'results_stats' and 'commit_results' binaries into
libexec and adjust appropriately. The html template is now in
$pkgdatadir.
* .gitignore: Add tags/ChangeLog generated files.
* README: Just some random notes. Needs to be rewritten anyway.
* controller/.gitignore: Add newly 'make'd files.
* controller/Makefile.am: Generate the libexec/bin files.
* controller/commit_results: Moved to controller/libexec as
dtf-commit-results.in.
* controller/configure.ac: Also substitute resulttemplatedir.
* controller/etc/dtf.conf.d/config.sh.template: The DTF_DATABASE
name was misleading - use DTF_DATABASE_DEFAULT instead.
* controller/libexec/dtf-commit-results.in: Moved from
controller/commit_results.
* controller/result_stats: Moved to
controller/libexec/dtf-result-stats.in.
* controller/libexec/dtf-result-stats.in: Moved from
controller/result_stats.
* controller/result_templates/html.tmpl: Moved to
controller/share/dtf-controller/results-stats-templates/html.tmpl.
Rename the config variable from DTF_OPENSTACK_ID to
DTF_OPENSTACK_DEFAULT_ID to better match the name with its
purpose.
* controller/bin/dtf-run-remote.in: Use DTF_OPENSTACK_DEFAULT_ID
instead of DTF_OPENSTACK_ID.
* controller/config/config.sh.template: Moved.
* controller/etc/dtf.conf.d/config.sh.template: Document the
renamed variable in its new place.
First part of converting controller to autoconf/automake solution.
* .gitignore: New gitignore; autotools ignores.
* Makefile.am: New file.
* get_machine: Renamed to template bin/dtf-get-machine.in.
* bin/dtf-get-machine.in: New template based on get_machine.
* run_remote: Renamed to template bin/dtf-run-remote.in.
* bin/dtf-run-remote.in: New binary template from run_remote.
* build: New bootstrap-like helper script (git-only).
* configure.ac: New file.
* etc/dtf.sh.in: Likewise.
* ansible_helpers/wait-for-ssh: Renamed to
libexec/dtf-wait-for-ssh.
* share/dtf-controller/parse_credsfile: Reworked script for
parsing OS credentials.
* parse_credsfile: Moved to share/dtf-controller.
* libexec/dtf-wait-for-ssh: Renamed from wait-for-ssh.
* ansible/*: Moved into share/dtf-controller/ansible/*.
* share/dtf-controller/ansible/vars/generated-vars.yml.in: New
template file exporting configure-time variables into playbooks.
* get_machine: New option --name and variable $opt_name.
.. to reuse remotely generated results and commit them to the
local controller database.
* commit_results: New script.
Turns out that our OpenStack has poor connectivity to Brno, so I
added a data mirror on one VM in OS1.
* postgresql-tests/lib_pgsql.sh (dtf_postgresql_data_mirror): New
PostgreSQL related API.
* postgresql-tests/config.sh ($dtf_dataurl): Removed PG API
variable.
($dtf_dataurls): Substitution for $dtf_dataurl; Array of possible
data URLs.
* postgresql-tests/tasks/upgrade-basic/runtest.sh: Reuse new PG
API^.
* postgresql-tests/tasks/upgrade-utf8-syntax/runtest.sh: Likewise.
* controller/parse_credsfile: Detect $srcdir to be able to read
the correct secret file from any CWD.
Make sure that on the tester machine everything is put into
$DTF_RESULTDIR. Similarly, on the controller machine, everything
should be put into --workdir.
* controller/run_remote: Detect $srcdir.
(workdir_prereq): The $opt_workdir is a temporary directory by
default.
* tester/run (run): Task results now go into $DTF_RESULTDIR/tasks.
The main xml result goes into $DTF_RESULTDIR/dtf.xml.
Try to split into three separate components -> controller, tester,
and 'tasks' (postgresql-tasks in our case). The controller
component is the main part, which is able to run the tasks
remotely. Tester is more like a library for the 'tasks' component
(should be reusable at the raw git level).
* controller: Almost separated component.
* postgresql-tasks: Likewise.
* tester: Likewise.
.. as the default 180s often seems to be insufficient.
* ansible/fedora.yml: Add wait_for=600 to nova_compute.
* tasks/initdb/runtests.sh: Skip this test if performed on f21-.
.. because data for x86_64 servers are already generated.
* lib_pgsql.sh (dtf_postgresql_upgrade_matrix): Testing upgrade
from f21 and f22 is now possible.
* config/config.sh.template: Variable description adjusted.
* config/os/EXAMPLE.sh: Likewise.
* runner/result_templates/html.tmpl: When particular task results
are not available, print NOT AVAILABLE instead of FAIL.
For that purpose you may use the '--extra-rpms-file FILE' option,
where the file contains a list of rpms to be installed. The RPMs
should be specified as URLs accessible from the testing machine.
Also, export the overall log file as dtf-run.overview.
* run_remote: Add new option --extra-rpms-file.
* ansible/fedora.yml: Conditionally include the
additional-packages playbook. Generate the dtf-run.overview.
* ansible/include/additional-packages.yml: New playbook used to
install explicit list of additional packages.
* ansible/include/download-results.yml: Fix to download also
dtf-run.overview file.
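One plausible way to consume such a file — the one-URL-per-line format with blank lines and '#' comments skipped is an assumption beyond "list of rpms", and the install step is shown as a dry run:

```shell
# Sketch: read an --extra-rpms-file (one accessible RPM URL per line,
# blanks and '#' comments ignored) and feed it to the installer.
read_extra_rpms() {
    grep -v -e '^[[:space:]]*$' -e '^#' "$1" | tr '\n' ' ' | sed 's/ *$//'
}

install_extra_rpms() {
    rpms=$(read_extra_rpms "$1")
    [ -n "$rpms" ] || return 0
    # On the test machine this would be, e.g.:  dnf install -y $rpms
    echo "would install: $rpms"
}
```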
* lib.sh: Mostly fix double-quoting and back-tick removal.
* lib_pgsql.sh: Likewise.
When the system's locale changed, e.g. from en_US.utf8 to
en_US.UTF-8, older PostgreSQL versions were unable to upgrade the
data directory. From Fedora 20 on we should be able to upgrade
without issues.
Related: #1007802
* lib_pgsql.sh (dtf_postgresql_cb_upgrade)
(dtf_postgresql_cb_upgrade_select): New callbacks for
dtf_postgresql_upgrade_tour function.
(dtf_postgresql_upgrade_tour): Function that determines which data
the installation is able to upgrade from, and performs all the
possible upgrade scenarios.
* tasks/upgrade-basic/runtest.sh: Switch to
dtf_postgresql_upgrade_tour usage.
* tasks/upgrade-utf8-syntax/config.sh: New testcase config.
* tasks/upgrade-utf8-syntax/runtest.sh: New testcase.
Based on pre-generated tarball with PostgreSQL data - download the
tarball, unpack and perform 'postgresql-setup upgrade'.
* lib_pgsql.sh (dtf_postgresql_unpack_remote_data): New function.
(dtf_postgresql_upgrade_matrix): New function. Detect which data
we should test against.
* run: Define new global $dtf_dataurl.
* tasks/upgrade-basic/config.sh: New testcase config.
* tasks/upgrade-basic/runtest.sh: New testcase.
* parse_credsfile: Use export to correctly propagate the parsed
values.
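Why export matters here, in a minimal sketch — plain assignments in a sourced parser are visible in the current shell but not in child processes (such as the 'nova' calls); the KEY=VALUE creds format is an assumption:

```shell
# Sketch of a creds parser that exports what it reads, so that child
# processes inherit the values too.
parse_credsfile() {
    while IFS='=' read -r key value; do
        case $key in
            ''|'#'*) continue ;;       # skip blanks and comments
        esac
        # 'export', not just 'key=value', so subprocesses see it
        export "$key=$value"
    done < "$1"
}
```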
Allow automated generation of data structures on a remote host.
Run like ./remote_generate IP_ADDRESS.
* gen-data/dist/dist: New file. Wrapper around tar to package
important files for data generation.
* gen-data/dist/dist.list: Include list for ^.
* gen-data/dist/dist.exclude: Exclude list for ^.
* gen-data/prep: Remote script to prepare everything for
successful ./generate run.
* gen-data/remote_generate: Wrapper running all the above.
In F20+ we should be able to deal with upgrades where the user (or
the system itself) changed the system locale, e.g. from
'en_US.utf8' to 'en_US.UTF-8' (which is just a syntax change).
Data generated by this task should help with testing this.
* gen-data/tasks/templates/locale-change.sh: Add new template for
locale related data-generation.
* gen-data/tasks/locale-cz/run.sh: Reuse template ^^.
* gen-data/tasks/locale-utf-typo/run.sh: New file for
'en_US.UTF-8' to 'en_US.utf8' switch. Reuse template ^^.
This data should help with checking the 'postgresql --upgrade'
behavior when the system locale changed significantly (from
en_US.UTF-8 to cs_CZ.utf8).
* gen-data/tasks/locale-cz/run.sh: New task file.
* gen-data/generate (locale_prereq): Fix missing LANG= prefix in
expected locale. Quote current/expected locale strings in error
output.
(single_task): Adjust $INDENT in sub-shell to not affect
subsequent calls. Create tarball in $OUTPUTDIR rather than in
`pwd`. Introduce hook_end callback.
(generate_tasks): Use dynamic list of tasks.
* get_machine: Rework; new options; use the parse_credsfile script
to parse the configuration, etc.
* parse_credsfile: New config-parsing script.
Also add the first task, 'basic', which generates a simple
'pagila' database.
* generate: New file.
* databases/pagila.sh: New database file.
* tasks/basic/run.sh: New task file.
* controller: Just rsync.
* config/config.sh.template: Document the DTF_PRESENTER_PLACE
option.
Better define the configuration and provide examples.
* controller: Unpack results to the correct directory, load the
configuration from the new place, call run_remote with proper
arguments, and generate 'results.html' with the result_stats
script.
* ansible/run_include: Adjust to better simulate run_remote.
* ansible/fedora.yml: Adjust for fixed configuration.
* run_remote: Likewise. Also small issues with option parsing
fixed.
* config.sh.template: Moved as config/config.sh.template.
* config/config.sh.template: Copied from /config.sh.template,
better documented options.
* run: Fix typo - use 'while read i' instead of 'for i in'.
* config/os/EXAMPLE.sh: New file - example configuration.
* private/os/EXAMPLE.yml: Likewise.
* config/hosts.template: Likewise.
* dist.include: New file with file patterns that should be
distributed to test machine.
* dist: Distribute only those files which are necessary.
* config/.gitignore: New gitignore file.
* ansible/fedora.yml: Remove leading dashes from before dict keys.
* ansible/include/download-results.yml: Make sure that trailing
slash is added to fetch destination.
* run: Double-quote variables, do not use the 'A && B || C'
construct, and use $() rather than backticks.
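The three lint fixes above, in a minimal before/after sketch (the commands and filenames are illustrative, not the actual 'run' code):

```shell
# 1. Double-quote expansions so paths with spaces survive word splitting:
file="my file.txt"
touch "$file"                     # not: touch $file

# 2. Prefer $() over backticks (nestable, easier to read):
dir=$(dirname "$(pwd)/$file")     # not nested backticks

# 3. Avoid 'A && B || C' as an if/else: C also runs when B fails.
if [ -f "$file" ]; then
    status=present
else
    status=missing
fi
```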