- Blackberry Bold - me
- iPhone 3GS - wife
- iPhone - daughter
Thursday, December 3, 2009
Wednesday, December 2, 2009
Last night, I attended an agile tools shootout hosted by the aRTP group. We looked at the following tools:
Microsoft Team System
IBM Rational Team Concert
Actually, the demos were given in two separate rooms, and I was only able to personally see the Zen, PivotalTracker, ScrumWorks Pro, and Rational demos. From the demos and what I could see on their web sites, I would broadly separate the Microsoft and IBM tools from the rest and put them into the more complex category. This is because both vendors are attempting to cover the full development cycle and make sure that detailed design and coding are covered as well. For example, Team Concert was demoed as an Eclipse plugin with source code control, build management, real-time notifications, and project management features all enabled.
Other tools, such as Zen and PivotalTracker, tended to be more like electronic 3x5 cards on electronic boards. They did offer the advantage over a manual system of automatically calculating burn down and other statistics.
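As a rough illustration of the kind of bookkeeping these lighter tools automate, here is a minimal burn-down calculation. The function name and the sample numbers are my own invention for illustration, not taken from any of the tools demoed:

```python
# Minimal burn-down sketch: remaining story points after each day of a sprint.
def burn_down(total_points, points_done_per_day):
    """Return the remaining points after each day, never dropping below zero."""
    remaining = [total_points]
    for done in points_done_per_day:
        remaining.append(max(0, remaining[-1] - done))
    return remaining

# A hypothetical 40-point sprint with uneven daily progress.
print(burn_down(40, [5, 8, 0, 12, 6]))  # [40, 35, 27, 27, 15, 9]
```

Plotting that list against an ideal straight line from 40 to 0 is essentially what the electronic boards render for you.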
In the middle of the complexity spectrum were Rally and ScrumWorks Pro because they added more project management features and the ability to integrate with other tools.
So which would I pick? Like any good consultant, my answer is "It depends."
It depends on the size and complexity of the organization using the tool.
It depends on the target architecture and technologies used (e.g. one tool I did not classify above is JIRA/GreenHopper, which is specifically used for Ruby development).
It depends on the level of control over tool content needed (e.g. several tools were SaaS, with concerns over security).
It depends on the sophistication of the developers.
It depends on the level of formality required of the process (e.g. if federal certification of the software is required then more traceability and reporting will be needed)
It depends on the users' risk tolerance (go with a small, flexible, rapidly changing tool/company, or stick with a slower-moving but more stable large vendor).
I did enjoy the exposure to the tools and we will be holding another shootout in the future.
Thursday, November 5, 2009
Saturday, August 8, 2009
This Saturday I attended my first BarCamp. The BarCamp RDU 2009 was hosted by Red Hat at their HQ on the NC State University campus in lovely Raleigh. Approximately 200 people attended and from that audience 36 people were able to present on a diverse set of topics.
- HTML 5 Discussion
- Power Present in 15 Minutes
- Learn how to Juggle
- Secrets of Effective Nomading
- Recommender Systems: Lessons Learned
- Free: Profit Killer, Inevitable, Necessary, or all of the above
- Static on the Line: How to handle feedback
- Palm Pre: Development for noobs
- Potpourri for $500
- Bughouse (a chess variant with two boards and four people)
- Polyphasic Sleep Q&A
- The Intersection of Usability, Accessibility, and SEO
- Building your A Team
- Rapid Return on Investment: Achieve 12 month break even using emerging technologies
- CALEA: Lawful Intercept
- Soft Appliances
- How to do Social Networking when there is no "Network"
- What's up with OpenSocial
- WTF is Biz Dev
- Intro to jQuery
- Alternate JVM Language overview
- Polka! - Triangle Vintage Dance
- When things go horribly wrong
- Which Languages and Technologies will be around in 10 years?
- Productivity of a Submariner
- Google Wave
- Managing the performance of servers in a large network... on the cheap
- Webkit Debugger
- How Smart Startups Win
- The Small Business Web
- Self Publishing Roundtable
- Query optimization in PostgreSQL
- Twitter Roundtable
As you can see this is not your typical technical conference. For example, I was exposed to the community of Polyphasic Sleepers for the first time at this conference.
So here are the talks that I went to:
Free - presented by Martin Smith
Marty led a discussion on the aspects of marketing concerned with niche markets (à la The Long Tail) and with new business models where content/services are offered to the consumer for free and revenue is generated via ads or via premium services offered to the free subscribers.
From a long tail perspective we discussed some of the benefits of business that operates in that market:
Distribution of Risk - Palm is betting the business on the Pre. If it does not succeed the company will probably not survive. If instead of a single product, a company was able to offer a large number of products to niche markets the risk would be distributed. One commenter mentioned that some products (like cell phones or pharma) require a large production to offset the development expense.
Marty mentioned the increasing complexity of the Internet and recommended Nonzero by Robert Wright as good background on how our society is evolving to deal with increasing complexity.
Someone else in the room said that the Long Tail principle applied to more than commercial products. She thought that ideas were also finding small niche groups of people. And those people tended to be more passionate about the idea and more likely to take action in the small group.
Rapid Return on Investment - John Baker
That’s right; I was able to get the new material that Chris Hanebeck and I have been working on in front of this audience as a beta test of concepts.
The basic premise that Chris and I have is that one can find projects in a company that can achieve break-even ROI within twelve months. We use a combination of out-of-the-box thinking, emerging technologies, and the discovery of analogous solutions from other industries to achieve the results. To get a copy of the material I presented, go to my website. A few people drifted out of the room during the presentation, and afterwards a participant mentioned to me that the example I used (RFID in the supply chain) was probably not familiar to the BarCamp audience. I am planning on developing a version that focuses on emerging Internet technologies and will have it ready for next year.
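The twelve-month break-even claim is really just a payback-period calculation. As a sketch of the arithmetic (all figures below are made up for illustration, not from the actual material):

```python
# Payback-period sketch: how many months until cumulative savings
# cover the up-front project investment (all figures hypothetical).
def months_to_break_even(investment, monthly_savings):
    cumulative = 0.0
    for month, saving in enumerate(monthly_savings, start=1):
        cumulative += saving
        if cumulative >= investment:
            return month
    return None  # did not break even within the horizon given

# A $120k project whose savings ramp up as the rollout matures.
savings = [5_000, 10_000, 15_000] + [20_000] * 9  # a 12-month horizon
print(months_to_break_even(120_000, savings))  # 8
```

The interesting work, of course, is in finding projects whose savings ramp steeply enough to clear the bar inside twelve months.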
What’s up with OpenSocial - Dave Johnson
This one was definitely more technical. Dave presented the basics of OpenSocial and the progress some companies like LinkedIn, Google, Ning, and Yahoo are making using the standard to share data and gadgets associated with social networking. The official site has a wealth of information. And Dave has his own personal blog where he covers OpenSocial and other efforts like BarCamp.
With so many companies investing in OpenSocial, it seems likely that current problems (e.g. poor security) will be solved.
What Languages and Technologies will be around in 10 Years? - Jeff Terrell
Jeff is graduating from UNC Chapel Hill and wanted to speculate with the audience on what languages/technologies it might make sense to invest time in learning. For example, will Ruby on Rails be around for a long time?
This led to a discussion on a wide range of topics:
- The language/technology will depend on the solution being developed. COBOL is still being maintained on mainframes in banks while C is common on embedded systems.
- The "browser" based interface is likely to continue to grow in its ability to support more and more applications.
- The browser-based rendering engine is complemented by the continued penetration of always-available, high-bandwidth wireless networks.
- The current keyboard I/O may be replaced by gestures or by voice recognition.
- Augmented Reality will become more common (see Layar).
Google Wave - Joe Gregorio
Joe demoed Google Wave, using a couple of other members of the audience to mutate the wavelets being created. He also showed how Robots work (very cool implications) and how Gadgets (using a semi-OpenSocial structure) can be dropped into a wave.
Google wants Wave to be a true replacement for email (and I suspect a lot more) and therefore is opening up the control of its future to Open Source.
The audience (this was the most popular session I attended all day) asked a ton of questions. For example: how will Wave work for a user on an airplane who, while disconnected, spends four hours mutating the wave they left the ground with? What happens when they land and resync? Joe explained that the Operational Transforms would be processed and the wave would be left in a correct state for all users. Joe said Google does not promise that the results are meaningful, just consistent, so some common sense will need to be applied to the approach. One idea I have for that: since most documents created by a team involve a division of labor, it might make sense to add an optional check-in/check-out protocol. Just before leaving on my trip, I check out Chapter 4 in the wave; when somebody else tries to touch it, that person is told to wait until the material is checked back in for common use.
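That check-in/check-out idea could be layered on top of a wave as a simple advisory-lock registry. This is only a sketch of my own concept (the class and method names are mine, not part of any Wave API):

```python
# Advisory check-out registry for sections of a shared document.
# A user checks out a section before going offline; anyone else who
# tries to check it out is told to wait until it is checked back in.
class CheckoutRegistry:
    def __init__(self):
        self._owners = {}  # section name -> user currently holding it

    def check_out(self, section, user):
        owner = self._owners.get(section)
        if owner is not None and owner != user:
            raise RuntimeError(f"{section} is checked out by {owner}")
        self._owners[section] = user

    def check_in(self, section, user):
        if self._owners.get(section) != user:
            raise RuntimeError(f"{user} does not hold {section}")
        del self._owners[section]

registry = CheckoutRegistry()
registry.check_out("Chapter 4", "traveler")   # just before the flight
try:
    registry.check_out("Chapter 4", "teammate")
except RuntimeError as e:
    print(e)  # Chapter 4 is checked out by traveler
registry.check_in("Chapter 4", "traveler")    # after landing and resync
```

The lock is advisory rather than enforced, which keeps it compatible with the underlying Operational Transform model: consistency is still guaranteed by OT, and the checkout only steers humans away from meaningless merges.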
Google Wave is really cool.
So my Saturday at BarCamp was well worth the time and if you have not experienced one yourself I encourage you to jump on the web and see if one will be happening in your location soon.
Wednesday, August 5, 2009
Lessons learned and corrective actions
- The FPR is conducted when things go wrong. The AAR and Retrospective occur for all outcomes.
- Both the FPR and AAR require the participation of one or more objective reviewers. The Retrospective depends on the team and, sometimes, invited guests.
- The FPR and AAR both use detailed analysis to find root causes. The Retrospective is more ad hoc.
- The AAR assumes that participants will change behavior based on the issues being surfaced; the FPR has a deliverable of lessons learned but no clear follow-up for change; and the Retrospective has a set of actions with the Scrum Master responsible for seeing they are implemented immediately.
Friday, July 10, 2009
I am a sucker for a "59 Minute Scrum" hosted by Bob Galen. I participated in one last November, as documented in this blog. Last night the IIBA hosted Bob, and I was again a member of a six-person team producing a brochure for the Pampered Pooch Day Care. I wanted to get another feel for the team dynamics during the sprints and to compare the results with the last session.
Here are my observations:
1. While the aRTP session was made up mainly of programmers and the IIBA session of business analysts (duh), there was little difference in the resulting brochures. I guess if Bob ran a session at the Pet Care Services Association the outcome might be different.
2. Before we started the Day 2 sprint, Bob pulled the four Scrum Masters aside. He told two of them to go back and emphasize the quality and completeness of the brochure, and told the remaining two to push their teams for as much content as possible in the time remaining. The results were telling: the two teams pushing for quantity delivered 13 and 8 user stories respectively, while the two focused on quality delivered 6 and 4. So teams will listen to the direction of the Scrum Master. Ultimately, to have a released product, both the quantity and the quality need to be good enough. So is it better to get lots of 60%-quality content in early sprints and then tighten it up all at once toward the end of the iteration, or to push for 80%-quality content and accept less content per sprint? It is a real trade-off that the team needs to decide based on the coupling and cohesion of the user stories: if the stories have few dependencies, push for the higher quality per sprint; with lots of dependencies, you need the total content present to debug and refactor.
Wednesday, June 24, 2009
Thursday, June 18, 2009
- Time to Profitability
- Reduced costs
- Can reduce IT Labor costs by 50%
- Can improve capital utilization by 75%
- Reduce provisioning cycle times from weeks to minutes
- Can reduce end user IT support costs by 40%
Friday, June 5, 2009
Tuesday, April 28, 2009
Last weekend I attended the Deep Agile 2009 conference at Harvard University. This two-day conference was put on by Agile Bazaar, an ACM chapter dedicated to the improvement of all things agile. Approximately 90 attendees participated, and I thought this was a very well run production. Kudos to the Agile Bazaar volunteers and especially Nancy Van
Tuesday, March 3, 2009
As a road warrior, I am always interested in anything that will improve my experience in the airport. I recently saw that Delta Air Lines and the TSA have teamed up to trial a paperless boarding pass at the Memphis airport. The traveler downloads an electronic boarding pass that is displayed on the device; the TSA has a 2D bar code scanner that verifies the boarding pass, and Delta uses its current gate scanner. As a time saver, the time spent at security and at the gate is not reduced; however, I do save the time spent in line at a kiosk to pick up a boarding pass.