Janet Gregory: Agile Consultant, Trainer, Advisor, Writer, Speaker

In August, I was at the Agile 2013 conference in Nashville, Tennessee, along with ~1700 people. I attended talks, participated in open space sessions, Lean coffee sessions, and multiple hallway conversations, and learned from all of them.

This year I didn’t give a talk but participated in the Stalwarts stage, which was one of my most interesting experiences at a conference. My topic was: Agile Testing – Bring your problems. I know it was broad, but I was interested to see what kinds of issues people at the conference were experiencing.

How it worked: There was a facilitator who collected questions from the audience (thank you very much to Jason Tice who filled in at the last minute). When he read a question, the ‘asker’ joined me on the nice comfy couch on the stage to give me more context about the question. We had 10 minutes to discuss that question and any issues around it – this was so that we didn’t spend the whole time on one question. I liked the session because it was about ‘real world’ problems – ones that everyday folks were facing on their teams.

By the end, I was wishing I had asked someone to record the questions and the discussions, but there was a saving grace. One of the participants (Sharon Robson @SMRobson) tweeted some of my responses during the session. I have included them here to give you an idea of how the session went and the kinds of issues that were brought up, along with a bit of explanation where I think it might be necessary. I also took a bit of liberty with the tweets by expanding some of the shortcuts – for example, ‘perf’ to ‘performance’ – to make them a bit more readable.

I do hope this provokes some conversations in your organization. Leave a comment or send me an email if you have questions.

[label style=”info” icon=”twitter”]Tweet:[/label]  Testers don’t need to be able to read code, but they do need to be able to understand their systems!

[label style=”info” icon=”twitter”]Tweet:[/label] To automate testing you really need to understand your system. Can you draw it?

[label style=”success”]Explanation:[/label] To be able to isolate and automate pieces of your system, using mocking or other techniques, you need to understand your system. I issue this challenge to testers, programmers, and anyone else on the delivery team: “Can you draw your system, even at a high level, such as a context diagram?”
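As a minimal sketch of the kind of isolation mentioned above (the component and gateway names here are hypothetical, purely for illustration), a piece of the system can be tested on its own by standing in a mock for its external dependency – which is only possible once you understand where those dependency boundaries are:

```python
from unittest.mock import Mock

# Hypothetical component: a checkout service that depends on an
# external payment gateway we want to isolate during testing.
class CheckoutService:
    def __init__(self, gateway):
        self.gateway = gateway

    def pay(self, amount):
        # Delegates to the gateway; returns a simple status string.
        return "paid" if self.gateway.charge(amount) else "declined"

# Replace the real gateway with a mock so the test exercises only
# CheckoutService, not the external system.
gateway = Mock()
gateway.charge.return_value = True

service = CheckoutService(gateway)
print(service.pay(100))  # prints "paid"

# We can also verify how the component talked to its dependency.
gateway.charge.assert_called_once_with(100)
```

Drawing the system first – even roughly – is what tells you that the gateway is the seam to mock here, rather than something deeper inside the service.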

[label style=”info” icon=”twitter”]Tweet:[/label] key tester skill = curiosity which turns the team into a “learning” team

[label style=”info” icon=”twitter”]Tweet:[/label] guilds of agile testers in an org are very important. Discuss & build a common language beyond the team

[label style=”success”]Explanation:[/label] Lisa Crispin and I call these ‘communities of practice’ and encourage organizations to use this effective mechanism for sharing learnings and experience between project teams.

[label style=”info” icon=”twitter”]Tweet:[/label] be aware that automated tests become “checking” rather than “finding” tests, but they can have high maintenance overhead

[label style=”info” icon=”twitter”]Tweet:[/label] automation by itself should not be a goal, but agile testing needs to have some automation

[label style=”info” icon=”twitter”]Tweet:[/label] continuous deployment changes testing. Has to happen earlier (preventative) rather than later (finding)

[label style=”info” icon=”twitter”]Tweet:[/label] Working software over comprehensive documentation does NOT mean NO documentation

[label style=”info” icon=”twitter”]Tweet:[/label] early engagement of the testers is about preventing defects before the code is written.

[label style=”info” icon=”twitter”]Tweet:[/label] Finding defects is after the code has been written. Manual scripted tests are not good at this stage, explore it

[label style=”info” icon=”twitter”]Tweet:[/label] early make non-functional tests as simple as possible & run them as soon as we can

[label style=”info” icon=”twitter”]Tweet:[/label] talking about sharing the info about performance testing with the entire team. Make it everyone’s focus

[label style=”info” icon=”twitter”]Tweet:[/label] load/performance testing can be more related to features than stories. The earlier the better but still at feature level

[label style=”info” icon=”twitter”]Tweet:[/label] Acceptance tests describe the intent of the story. Acceptance criteria can be constraints that may move at project level

[label style=”info” icon=”twitter”]Tweet:[/label] lack of knowledge of story intent leads to scope creep

[label style=”info” icon=”twitter”]Response by @Dan North:[/label] s/story/product/ and you have a working definition of scope creep

[label style=”info” icon=”twitter”]My response:[/label] Acceptance tests identify the scope of a story and help keep that story “right”…

[label style=”info” icon=”twitter”]Tweet:[/label] Just got on the stage with @janetgregoryca to share my blend of the auto pyramid & Brian Marick matrix. Great advice!

[label style=”info” icon=”twitter”]Tweet:[/label] talking about the relationship between the automated pyramid and tasks, stories and features. pic.twitter.com/5IvUUfoCs3

[label style=”success”]Explanation:[/label] See [icon_link icon=”external-link” link=”http://blog.softed.com/2013/08/26/1883/”]http://blog.softed.com/2013/08/26/1883/[/icon_link] for Sharon’s blog post about the discussion.
