General updates to countdown list and gantt chart.
Next actions check-in.
We are behind on some of these - what can we do to move forward?
Free-busy work as Preview.1 candidate.
I wanted to get a sense of what's on everyone's mind - what they are worried about now.
Review of Next Actions
Terms of service - Jared still working on this.
Positioning statements - Pieter out sick - will follow-up later.
Product naming - PPD sending a last call proposal to list today.
Domain name - will send out a last call when we have closed on the naming.
Moving the wiki - Ted will have a look at latest proposal for landing page to see if it meets the requirements.
Met last week and closed on the feedback round from the responses on PR list.
Need to close on the third option to see where we could go with that.
Iterating on visual treatments.
In 2 weeks we will be ready to present to Mitch - March 12th (mid-March).
Reviewed the wiki areas that people have been working on.
Gather a list of open issues.
Send an email to the list and have owners address the open issues.
Missing some Cosmo stuff - Priss will help with this.
End user content for Preview
PPD will close on this list (via design list). We can then discuss at the next Preview meetings and get volunteers to work on content.
Free-busy stuff - candidate for Preview.1
We need to understand what this involves.
PPD will follow-up to get all the specifics - CalDAV scheduling requirements and implications for Chandler UI.
We haven't created the Preview.1 bin yet but will tag this as an issue to track.
Discussion of Risks/Issues
Wanted to get some feedback on what is worrying people, risks, thoughts on Preview.
EIM and fallout from that - all the work we need to get this wrapped up. We have a lot of new code and there are risks with this.
Performance - we don't have enough people working on performance.
QA bandwidth - all the features coming in at the end and test coverage.
Rolling release date - conflicted between wanting to get it right and gold-plating. We put a whole bunch of hard features late in the release schedule. EIM is coming in really late - too late. Wants more clarity around a specific date.
Ability to QA all the new features properly.
Confused about dump and reload - do we need this? After some discussion we realized that there was a disconnect on the list. The solution proposed will satisfy all the use cases. For Preview it is backup and restore (there isn't anything else), and it works cross-platform.
Bear - send an email to the design list to clarify terminology.
Dogfooding feedback - doing this for real. Not enough real world usage.
Agree with Jared - we need to focus in on a date. Uncertainty is hard - wearing on the developers.
Bear:
Slipping release date - adding new features to the code base, e.g. the plugins issue.
Every time we get closer to the major release, developers add stuff - worried that this becomes a habit when they have extra cycles.
Rolling date issue.
Dogfooding - more people testing the app.
Feature creep in general
Preview pressure - expectations are too high.
Scott's book - attention, pressure - tightening up the process. Need to release close to planned date.
First Preview release - punting on features if we need to do that to make the date.
Would rather punt features than slip more.
Expectations not clear - need well defined bar on quality for Preview.
Dogfooding and testing - sharing bugs. We see recurring issues that can't be reproduced and it's hard to make progress on fixing them.
Main themes in the discussion:
Dogfooding and testing, good app coverage. Worried about getting enough testing in general.
General worry about all the new features coming in at the last minute.
We used to have more usable checkpoints but this has been difficult with all the big feature work landing.
Aparna: Checkpoints not up to standard. How do we prioritize this against other developer work? What process do we want to follow? We need to set these expectations. Do we have developers stop feature work to make checkpoints more stable?
Preview expectations, there seems to be some disagreement on this. What is good enough?
Bugs - too many bugs.
Bugs that block dogfooders.
Features that are there have to be usable - complete.
There are just differing opinions on what it means to be solid.
What is our ultimate goal? Get users - what does this mean?
Do we want people to use the app, or try it out for a day, give us feedback, and then never use it again? Does that qualify as success in getting feedback?
Seems that we want more than that. We want people to try and replace what they use today or use this because they don't use anything else today.
We should quantify the expectations more clearly.
There is some worry that the slippage is due to feature creep, extras. Are we really trying to meet our date?
Others want a fixed date that we cut to meet - important internally and externally.
We don't know how long it will take to get to stable. Some features, e.g. performance, will just be blockers, but we don't know exactly how long it will take to get them good enough.
Mimi understands that internally the wear of a rolling release date is hard. If we slip another 3-4 months, on the overall scale of the project it's small. The project has been going on for a long time.
Hard to forecast a fixed date - fixing bugs, getting it stable enough. What if we fix all the bugs we can in 3 weeks and it's still not stable?