Improving the ISO testing process and results
Discussing improvements to how, when, and why we ISO test, and defining and understanding the community's enhanced role in ISO testing.
Blueprint information
- Status: Started
- Approver: Kate Stewart
- Priority: High
- Drafter: None
- Direction: Approved
- Assignee: Nicholas Skaggs
- Definition: Approved
- Series goal: Accepted for precise
- Implementation: Started
- Milestone target: None
- Started by: Kate Stewart
- Completed by:
Whiteboard
- I suggest we think about and analyze the Precise Final Report a little: https:/
- The entire list is extremely focused on Ubiquity. It's clearly the most mentioned word in the document. I see two explanations for that:
1) The installer has in fact been broken for many releases now, and we already know that. We should focus on fixing or replacing it ASAP. A full code review should take place. There is no need for more bug reports to point that out to us.
2) Test cases are hugely focused on installing the ISOs.
- No conceivable number of test cases would allow us to feel confident about all the potential bugs and breakage in something as complex as Ubuntu. ISO testing will never cover that.
- We should admit that ISO testing is only an effective way of testing the installation of the many ISOs, and only in the few days before every milestone release. It won't ever be more effective than that.
- It would be much more effective to put testers in touch with developers in a task force focused on Plymouth/GRUB, Ubiquity/Wubi, and Jockey. Those are the basic pillars of a working OS, from ISO to a working desktop. If we see improvements and these three become rock solid, almost everything we see in ISO testing reports would be fixed.
- ISO testing also takes the focus away from real testing, done by those who run the development releases every day as their main production OS. These are the people who report all the bugs of the OS, the ones that are not covered by ISO testing.
- See https:/
- I suggest we simply filter Launchpad bugs per package, rank the results, read the reports, test packages, create high-value bug reports, approach developers with the information, and get packages reviewed and bugs fixed. Then move on to the next, one by one, focusing on those with the most bug reports. We have more or less 6 months ahead of us. It's conceivable that with a package-targeted strategy we could get a lot of bugs fixed by working this way. Once again, I ask: do we need more bug reports, or do we need fewer bugs? [Effenberg0x0 at Ubuntu.com]
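- A minimal sketch of the per-package triage idea above, using launchpadlib; the consumer name, package, and status filter are illustrative assumptions, not part of the proposal:

    # Hedged sketch: list open bugs for one source package via launchpadlib.
    # The consumer name, package, and status filter are illustrative choices.
    from launchpadlib.launchpad import Launchpad

    lp = Launchpad.login_anonymously('iso-testing-triage', 'production')
    ubuntu = lp.distributions['ubuntu']
    package = ubuntu.getSourcePackage(name='ubiquity')

    # Open bug tasks for the package, ready to be read, ranked, and triaged
    for task in package.searchTasks(status=['New', 'Confirmed', 'Triaged']):
        print(task.bug.id, task.status, task.bug.title)

  Running this against each package in turn would produce the per-package bug lists the triage workflow starts from.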
AGENDA:
Current ISO testing procedures
provide a quick summary of ISO testing and results from Precise
copied desktop images to server images in the tracker
Ideas for improving ISO testing
better documentation?
document what 'success/failure' means
more tutorials?
screencast? revisit old screencasts.
more personal interaction?
achievements?
isotracker API could be utilized to create "ISO testing" achievements (see the sketch after this list)
better communication?
nicer tools/process?
Run QRT (QA Regression Testing) tests related to applications on the ISOs
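A hedged sketch of the achievements idea above: poll the isotracker for a tester's submitted results and map counts to badges. The XML-RPC endpoint and the results-listing method are assumptions for illustration; check the real isotracker API before relying on either.

    import xmlrpc.client

    # The endpoint URL and 'results.list' method are assumptions, not a
    # documented isotracker API; only the badge logic is the point here.
    tracker = xmlrpc.client.ServerProxy('https://iso.qa.ubuntu.com/xmlrpc.php')

    BADGES = {1: 'First ISO test', 10: 'Ten ISO tests', 50: 'Fifty ISO tests'}

    def achievements_for(username):
        results = tracker.results.list(username)  # hypothetical method
        return [badge for threshold, badge in sorted(BADGES.items())
                if len(results) >= threshold]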
Community ISO Testing
How to interact with the release team?
Responsibilities
Ideas for improvement:
Look into dogtail (similar to LDTP) from Red Hat. It uses the accessibility layer, so it might not be an option for desktop testing under Unity. The dogtail API is described at the bottom of https:/ (see the sketch after this list)
Address the usability issue where people looking at the ISO tracker think that testing once is sufficient
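A minimal dogtail sketch, assuming gedit is installed and AT-SPI accessibility is enabled; the application and widget lookups are illustrative, and as noted above this approach may not work under Unity.

    # Requires the accessibility layer (AT-SPI) to be enabled.
    from dogtail.utils import run
    from dogtail.tree import root

    run('gedit')                      # launch the application under test
    app = root.application('gedit')   # locate it in the accessibility tree
    app.child(roleName='text').typeText('typed by dogtail')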
Areas of improvement:
juju and all its dependencies (zookeeper, etc.)
openstack
tests should take a reasonable amount of time wherever applicable
e.g., don't expect a user to run through 2 hours' worth of tests without a break
Work items:
[hggdh2] Document expectations for writing a test and adding it to the regression suite for every bug we find and fix (a sketch follows these work items). Help implement this process: TODO
[nskaggs] Update documentation and ensure all aspects of isotracker are documented: DONE
[nskaggs] Update all testcases and ensure they make sense and can be executed by a community member: DONE
[nskaggs] Continue the 'adopt an ISO' program to ensure mandatory tests are being run for each ISO: DONE
[nskaggs] Encourage community participation by giving away swag and hardware to people who provide good sustained testing effort: DONE
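As a hedged illustration of hggdh2's work item above, one per-bug regression test might look like the sketch below; the bug number, command, and flag are placeholders, not a real fixed bug.

    # Hypothetical regression test, one per fixed bug; 'some-tool' and its
    # flag are placeholders for whatever the original bug exercised.
    import subprocess

    def test_lp_bug_000000_no_crash_on_flag():
        proc = subprocess.run(['some-tool', '--flag-that-used-to-crash'],
                              capture_output=True, text=True)
        assert proc.returncode == 0, proc.stderr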