Expanding and Refining Manual Application Testing
Discussing options for improving manual testing, metrics, and user feedback for those running the development version of Ubuntu
Blueprint information
- Status:
- Started
- Approver:
- Jono Bacon
- Priority:
- Medium
- Drafter:
- None
- Direction:
- Approved
- Assignee:
- Nicholas Skaggs
- Definition:
- Approved
- Series goal:
- Accepted for quantal
- Implementation:
- Started
- Milestone target:
- ubuntu-12.10
- Started by:
- Nicholas Skaggs
- Completed by:
Whiteboard
- [effenberg0x0] I think what I suggested at https:/
AGENDA:
Review Precise Cycle
How much feedback/testing occurred?
13 calls for testing, 8 different developers/teams, > 100 people
10 testcase contributors, 8 contributors to checkbox-
What was useful / what went well?
Opinions?
We don't want to burn out or ask too much of our testers
We do want to acknowledge and communicate their contributions
Tools
Checkbox
manual app testing was done in Precise using it
Moztrap
currently doesn't fit our needs and doesn't offer an API to interact with
ISOtracker
plans to modify to allow reporting of results and testcase management; see session
http://
Daily Testing
Delivering new packages and testcases to users on the desktop, simply by running the release
appear via an indicator, a la update-manager
offer to test a package
based on hardware?
based on existence of the proposed repo?
based on ?
install package automagically from ppa, feed tests and start up app for testing via one-click acceptance :-)
Getting positive testing/feedback loop into the hands of users and developers
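The "one-click" flow above could be sketched as the following shell steps (the PPA and package names here are purely illustrative assumptions, not an actual testing PPA):

```shell
# Hypothetical sketch of the one-click acceptance flow:
# enable a testing PPA, pull the package, and launch it for manual testing.
sudo add-apt-repository ppa:example-team/testing   # PPA name is an assumption
sudo apt-get update
sudo apt-get install example-package               # package name is an assumption
example-package &                                  # start the app for testing
```

The indicator would effectively run these steps behind the scenes once the user accepts the test offer.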
Apport
enhancements needed?
Manual trigger of Apport: https:/
Delivering changelogs in a more usable and noticeable way
aka, what changed with today's updates?
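For reference, Apport can already be triggered manually from the command line; `ubuntu-bug` accepts a package name, a process ID, or a program path (the package name below is illustrative):

```shell
# Manually trigger Apport for a specific package:
ubuntu-bug example-package     # package name is an assumption
# or for a running process, by PID:
ubuntu-bug 1234
```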
Proposed Repository
how to best utilize?
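For testers who want to opt in manually, the -proposed pocket can be enabled with a sources.list entry along these lines (a minimal sketch for quantal; the mirror URL and component list may vary):

```shell
# Add to /etc/apt/sources.list or a file under /etc/apt/sources.list.d/:
deb http://archive.ubuntu.com/ubuntu quantal-proposed main restricted universe multiverse
# then refresh the package lists:
sudo apt-get update
```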
Coordinated events
Milestone testing
Specific "calls for testing"
Testing Days
will use isotracker to record
will schedule with skaet
Other Ideas from whiteboard
it would be amazing if developers had some way of indicating what should be tested / stress tested in a package.
[nskaggs] Could we cover this by using a specific testing repository (proposed?) that contains specific packages and versions with the changes (on webpage?)
We should also invest on making LiveTesting work, focus it on critical ones, like Ubiquity, Jockey, etc.
[nskaggs] Agreed, covering idea in the "testing" days idea above
We must offer some interface through which developers can request testing for their packages. [effenberg] Developers could add a testing-
[nskaggs] what do you propose? Currently this is encouraged but in an ad-hoc manner
A special tag, identifying scheduled-
[nskaggs] I am unsure what is meant here [effenberg] was an idea to report test-request results to developers, in a way that doesn't get lost in the flood of reports in LP.
Why not ask Ubuntu users to create a Launchpad account when they download a development release, auto-add them to a "Development Release Users" team, and present them with a webpage pointing out that they can join a testing team and contribute?
In general how can we make running the development version more "testing" focused? In your face prompts, apps, etc.
Loose notes
Test case ordering could be improved so that repetitive cases are grouped together
whoopsie session: http://
* can we aggregate the changelogs out of update-manager into a single view, so that users don't have to click on each package in the list to get an idea of what's changed? apt-listchanges can be configured to send email.
* link out to launchpad to show the diff between the two versions?
* Possible errata releases alongside package updates?
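As a stopgap for the aggregated-changelog idea, apt-listchanges can already be configured to mail changelogs with each upgrade (a minimal sketch; the exact prompts depend on the package version):

```shell
# Reconfigure apt-listchanges interactively and choose the "mail" frontend:
sudo dpkg-reconfigure apt-listchanges
# Alternatively, edit /etc/apt/listchanges.conf directly, e.g. in the [apt]
# section set frontend=mail and email_address=<your address>.
```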
How do you push fixes to -proposed?: https:/
== ACTIONS ==
Work items:
[cgregan] provide information about apport-bug to nskaggs: TODO
[popey] follow up on apport refusing to file bugs for packages taken from the official unity ppa: TODO
[kate.stewart] add Testing Days to Release Interlock page and coordinate with nskaggs: DONE
[vorlon] discuss with mpt about adding a button to update-manager to let users see the list of all changes in one go instead of having to drill down into the list; useful to have available in the development release in particular: DONE
[nskaggs] Meet with SRU verification team to discuss resource contention: DONE
[nskaggs] Meet with stgraber to discuss option for translation of testcases (to other languages) to enable wider pool of testers. : DONE
[nskaggs] Talk to apport team to understand plans for apport (how it prompts, when it appears, etc) during the development release: DONE
[nskaggs] Contact AlanBell to ensure accessibility testcases exist and are well-represented for accessibility: DONE
[nskaggs] Meet with stgraber to discuss using isotracker for manual testing events: DONE
[kate.stewart] investigate options for making education available to folks who install the CD or when apport comes up: POSTPONED