They Want What? Managing User Feedback
Notes by AdamW, all errors, omissions and misrepresentations are mine, please feel free to correct any! The Google Docs notes (linked below) have participant information.
they want what?! - todd (slide / google)
there's an art to managing feedback: share feedback and best practices
todd: decide if you're going to have an open or closed feedback process and make that clear from the start. 'black box' (microsoft style) or keeping the feedback open all the way up the chain - a 'contract' type relationship where it's clear what each party will do. provides a way you can go back to developers and remind them of the implied / explicit relationship with the user community.
laurie: oracle's issue is that they are a very large company, with inconsistencies in feedback handling between different products and different customers / customer relationships. voting has not been very effective: popularity contest, can produce inaccurate results and produce excessive expectations.
chelsie: wordpress talks to the developer community about what they want in the next release, and looks at statistics from wordpress.com to see what end users need. when they roll out new features in point updates - e.g. the UI refresh in 3.2 rolled out to wordpress.com 6 months ahead of the open source release - they look at the numbers of tickets related to the feature. every year they post on wordpress.org asking 'what do you expect out of wordpress this year?' they also get feedback directly from enthusiasts at wordcamps (wordpress-specific conferences). 80/20 rule. no direct democracy.
todd: 'the users really want X', 'the users are really upset because they don't have Y' - emotionally laden language is an attempt to translate a big chunk of feedback, but does not fit well within the organization. organization wants to hear numbers: revenue impact, number of users who want X or Y. emotion vs. data and context. engineers don't know if you have ten total users or 1100 total users, so 'ten users want this' doesn't mean much to the developer. provide data context with your feedback.
chelsie: it's important to have a person or a team that does the feedback mediation role over a long term - adam: develops a trust relationship. chelsie: present a use case that's going to be solved rather than just 'users want feature X'.
jon cruz: interpreting user feedback: users will have an idea of how to achieve what they want but it may not necessarily be the right way. so: discern what it is they actually want and communicate that. chelsie: do live user testing, set up a task list and see how they accomplish it. evan: there's no such thing as a bad idea, just a bad solution to a real problem.
paypal lady #1: problem spaces - have users talk directly to product. chelsie: interesting difference between company structures and community structures. communities tend to have direct lines of communication built in much more naturally than company structures.
laurie: have the developer watch the user struggle: communicate the ux research process directly to the developer. hard to scale this to a huge developer community - how do you have 5000 developers talk to the community? chelsie: at automattic new hires are required to do three weeks of support work - including engineering hires.
paypal lady #1: having a sync interaction has much more of an impact than async. chelsie: giving the developer a responsibility relationship has a big impact (see the support requirement). laurie: telephone game - with some relationships (corporate) the reporter is not the person experiencing the problem - everything gets more indirect.
adam: can corporate structures take cues from communities about interactive and open feedback? richter: often the customer is the one who requests privacy of feedback. solution: accept the private feedback then aggregate + publish it anonymously - 'these are commonly encountered problems' - and request comments.
valorie: kde and amarok both have bugzilla. barrier to entry? creating an account, following the process: some reporters don't want to go through the process. wontfix bugs are frustrating, especially if they recur. chelsie: most end users don't get to the structured feedback. at wordpress, some developers do actively go out to forums and twitter etc. and pull reports into trac. most wontfix requests come from avid new contributors. adam: fedora / mandriva developers hate forums and wouldn't do that - what's the difference? chelsie: automattic / wordpress emphasize developer community interaction - 'genius bars' at live conferences where developers form panels.
evan: what do you do with minor feature requests? it's valid, everyone agrees it would be useful, but it's not critical. chelsie: maybe someone wants to work on it. laurie: put it back to the community. stats are important - you can run polls and get rough estimates.
mj: feature request thing - they get a lot. they make an iphone app for a low-budget company that's creating a feminist calendar; a lot of feature requests are for trying to make it work like the paper version it was based on. finds that a real response from a real person explaining why it hasn't happened yet, and trying to bridge the gap, works - e.g. 'why isn't it on blackberry?' 'well, we don't have any users on blackberry - can you bring us some?' being open and not defensive. recreate the concern. jon: ask the feedback provider to help out - 'we don't have resources for all of this, but if you help us out with some testing maybe we can'.
mj: a lot of the discussion is software - what about non-software feedback? valorie: process problems; a complaint about an issue may be a complaint about a process in disguise. adam: turn a bug problem into a process solution. amarok - there was a streaming feature that was often buggy, so they split the streaming features out as plugins / scripts: it provides an easy avenue for people to contribute and separates the script issues from the amarok release process. mj: curate the content that you'd like to see - provide direction but ask for contribution on implementation. valorie: when people make an effort to clear a barrier to entry, thank them - give them a reward. todd: appreciation is a powerful tool, especially if it's genuine and specific.
evan: communicate that you aren't afraid of feedback - ties into mj's point about not being defensive. chelsie: for some end-users, a 'bug reporting system' is a turn-off - make sure there's a process which is more just about 'providing feedback'.
evan: provide a structure to help people provide feedback - help them to help you. chelsie: templates for bug reports.
michael and ed, mediated by evan: both have widely distributed user communities and want to get feedback from them. evan: be as accessible as possible - try to cover every possible avenue of communication that someone might want to get in touch with you over. also broadcast the backchannels to your users in as many ways as you possibly can. valorie: phone apps for feedback? michael: we're a platform, not a product, and there are a lot of plugins. often the system is used in a lan-based, offline environment. there's usually a developer-type person for each installation, and they use them as an intermediary, but they'd like to drill down and get back in touch directly with the users to avoid the telephone-game and problem-misrepresentation issues. evan: make it clear that you do want feedback from the end users, as many people assume that companies don't want feedback. only geeks assume that people want feedback and go search out a feedback channel; other people assume they don't. michael: people using the system at the lowest end-user level have no idea that there's a big interactive community around the platform. someone: how about writing up a survey and printing it out and distributing it via the intermediary developers. chelsie: try to record why people are doing what they're doing, not just what they're doing: why are they using 8 clicks to do a 3-click job? ask each user community to send one representative in to do in-person usability testing. match that against statistical data.