Steve Backman's blog

Surviving Browser Compatibility Issues

If you thought browser compatibility issues were on their way out, think again. As more organizations realize the cost savings of using a web application over a traditional desktop application, and more software becomes packaged as software as a service, some of the same issues that plagued desktop apps have come back to haunt us in the Web world. As installed software gives way to web services, the complexities of operating system versions and software updates have faded. On the other hand, as we demand more and more from web applications, arcane browser version issues now surface.

Why should this be when browsers evolve and improve at an ever faster pace? As of October 2011, Microsoft Internet Explorer’s browser share fell below 50 percent even as IE9 offers lots of great new features. Firefox has kept up a breakneck pace of change, and Google Chrome continues to gain share at about 1 percent a month. And Apple Safari dominates mobile browsing (“i” users apparently browse far more than Android and other mobile users). Note what the author calls the browser long tail. It’s not just the persistence of Microsoft IE6; users also complicate things by hanging onto older versions of Firefox. Another useful site for exploring these issues is netmarketshare: you can see a lot of the useful content even without a paid, premium account there.
 
Should your organization choose one browser and version, lock it down and prevent updates? Unfortunately this is not a practical strategy. 
 
It is hard to turn off updates. Chrome users, in case you have been looking, can no longer do so in the browser settings; it can only be done at the operating system level. Firefox users can still do so at Tools -> Options -> Advanced -> Update. Yet Mozilla, which reached a popular “stable” Firefox version 3.6 less than two years ago, really wants users to update to version 7, with 8 on its way.
 
Browser updates address security and performance. They help protect users against on-line threats that go beyond old-style email attachment viruses. And we want them to continue to tune performance. If we do more and more on-line, we probably need more and more browser tabs open at the same time. Those browser tabs gobble up memory and system resources. If you don’t regularly update, you lose out.
 

Enter HTML5

Browsers continue to evolve also to support ever richer interactive web browsing experiences. When Apple declared war on Adobe Flash on its mobile devices, you might have wondered, what’s the alternative? The emerging alternative for multimedia and other interactive experiences is HTML5. In the long term, HTML5 promises standard, streamlined ways for web applications to provide the interactive experience once only possible on the desktop, now tied in with the web services we want.
 
With HTML5, we have a new web standard, yet what HTML5 covers continues to evolve and the compliance of browsers and software tools varies. For the curious, here is one place you can go to check how well your browser supports HTML5: http://html5test.com/. If you think that HTML5 is already an old issue, have some fun on that site. Another useful--and fun--site is http://caniuse.com.
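If you are curious what those compatibility checkers are actually doing under the hood, the basic technique is feature detection: test for specific capabilities rather than sniffing browser names and versions. Here is a minimal sketch in TypeScript; the handful of features checked and the function name are just illustrative, not html5test's actual test suite.

```typescript
// Minimal feature-detection sketch (illustrative checks only).
function detectHtml5Features(): Record<string, boolean> {
  const canvas = document.createElement("canvas");
  const video = document.createElement("video");
  return {
    canvas: typeof canvas.getContext === "function",
    video: typeof video.canPlayType === "function",
    localStorage: typeof window.localStorage !== "undefined",
    geolocation: "geolocation" in navigator,
    webWorkers: typeof Worker !== "undefined",
  };
}

console.log(detectHtml5Features());
```

Libraries like Modernizr package up exactly this kind of check, which is why well-built sites can degrade gracefully instead of breaking outright on older browsers.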
 
My colleague David Gabbe, who often finds himself mired in the middle of browser compatibility issues for Drupal and Microsoft .Net, encouraged me to write about this issue: “HTML 5 promises much, the change is a big one with some bumps along the way.  But developers must do a good job following CSS and HTML best practices to realize the hype.” Even complete software-as-a-service environments can have browser issues, especially for new rich-browser features and for custom-developed pages. 
 
Spending time checking browser versions and updates will continue to be part of the life of developers for a while. And developers have to be creative to stay on top. To take one example, though Safari use remains low on both Windows and Mac desktops, we still test sites there. Fixing a CSS or similar compatibility issue in Safari will often prevent it from showing up later on an iPad or iPhone.
 

Installed Software Returns as Mobile Apps

 
Will browsers matter less because apps will matter more again—now as installed mobile phone and tablet apps? The difference between a mobile-friendly site and a mobile app is also an important one. Creating apps can be a way to regain control compared to the anarchy of browsers and browser versions. David commented, “The rise of the mobile devices -- phones and tablets -- has led developers to consider the pros and cons of a web app on a small screen. Mobile devices offer capabilities, such as GPS and bar code scanner via the camera, that today can only be accessed thru a mobile app. Yet the plurality of mobile devices places a strain on developers whether they are making an app or web page.”
 
Overall, browser-based applications will continue to dominate. An up-to-date Drupal site should render just fine on most phones, and that is one good reason among many for using a modern, standards-compliant content management system. Another note: Amazon and other companies may turn to HTML5 web sites as an alternative to mobile apps to bypass Apple app store licensing restrictions.
 

Managing Bookmarks with Multiple Browsers

To end on a practical note: all these issues generally make it harder to get by with just one favorite browser. If a feature on a favorite site suddenly stops working, the culprit could be an update to the site, a security patch to its server software, a browser update, or something else. Unfortunately, this often means shifting back and forth among browsers.
 
A big challenge with browser hopping is bookmarks. If you only use one browser, you may be able to sync nicely with your mobile. Maybe you find that appealing. I don’t find that I need my whole bookmark repertoire on my cell phone or tablet. What’s more important to me is being able to pick up a bit of reading or research started in one browser and continue it in another, sometimes switching from desktop to mobile, sometimes on the same machine. How to do this is going to be a matter of taste, and I mostly want to encourage you to believe you can make a multi-browser world work for you. Rather than use a service that just syncs bookmarks, I use a combination of three things that can store bookmarks cross-browser and also do more.
  • Delicious. After a rough time this year, it’s back and improved, and I’m back on it. Browser add-ins will let you bookmark sites there instead of using the browser’s bookmarks.
  • Evernote: For project research, Evernote is a great place to save bookmarks as well as whole pages, and to add your own notes and tags to them. The articles I read for this post accumulated in Evernote.
  • And last, for things you want to read today, check out Pulse news for your Android or iOS device. It supersedes a traditional news reader. And it neatly lets you clip an article on your desktop and mark it to read later on your mobile.

 *   *  *

Like some other posts, getting this out of my system has been therapeutic. It is hard to explain to client project leads why browser testing and updates matter these days. Moving to the web to escape desktop complexity today is not without irony, since we still have to deal with installed desktop (or mobile) environments, this time with regard to the browser. The good news is that the direction is positive and the benefit for users will be the kind of interactive user experience in the browser that we all expect and want today.
 
 

 

Tips for Software Trainings

For a quick, inside look at how Idealware manages so many successful on-line trainings, take a look at Andrea Berry’s post, http://idealware.org/blog/askidealware-how-does-idealware-prepare-seminars. Like Idealware seminars themselves, Andrea is clear and to the point -- focus the subject matter, prepare an outline and organize the content. Of course, as Andrea pointed out, Idealware does three to five seminars a week. What about the occasional, or even first-time, presenter facing a roomful of new users of complex software? Here are some thoughts on preparing for trainings that maybe pick up where Andrea left off. They come from recently helping two members of a project team prepare to lead their first software training ever.

First a general thought. Even if you haven’t done a lot of training, draw on your own style. Everyone has one. The two people doing the sessions I was helping on are experienced speakers and presenters in their profession, just not in technology training. Whatever makes you good in other circumstances, you should use in training. From there, consider what’s different. 
 

Organizing the Presentation 

It’s tempting to start from the beginning and walk through the menus and pages from end to end. And since you probably know those menus and functions yourself, it’s also tempting to figure it will be a piece of cake to just log in and walk the new users through it all.
 
Taking that approach will likely exhaust your audience. And you may also bump into some feature that doesn’t do what you want it to. No matter how much of a multi-tasker you consider yourself, there is something about holding the group’s attention while typing on a projected screen that leads to trouble.
 

Yes to Slides?  

Instead, pick out two or three representative complete scenarios—a case, a project, a client, a grant, etc.--and walk through all the steps you will take in advance. Whether you need to rehearse exactly what you will say is more a matter of speaking style. At least list out, step by step, all the clicks and entries you will make. Then, whether or not you write out your remarks, you can annotate the steps with essential tips or commentary.
 
For this, you might find it helpful to make a slide show. For the heart of the demo part of the training, you can make a series of screen shots with clickable links to key live pages. Then alternate those screen shot slides with slides carrying your lessons and tips. (Or put them in the notes or otherwise note them.)
 
Having the slides also can be insurance against disaster—no Internet access, problems with the software that day, or anything else. Also, like any other presentation, use introductory slides to set goals and expectations. 
 

Like a TV Cooking Demonstration 

Now go one step further. You might take a page from Julia Child and all the cooking shows that have followed. On TV, we often see the beginning steps live in great detail. Then the chef jumps to a view of the finished product to give all the hot tips. In a training, have examples of those same two or three scenarios all set up in reserve. You can start with live entry and then jump to commenting on a completed example at any point. 
 
A variation. Instead of walking through a small number of situations in detail, it sometimes works for me to think of, say, a dozen top daily or periodic tasks staff will face. They can be in order of likelihood instead of order of entry. Do the training as a series of "how to" mini tutorials, instead of an end-to-end walk-through of the whole system. Could this come across as disjointed? Sure. Yet it could also come across more the way we use good web help, jumping first to the Frequently Asked Questions section and seeing if we can find quick lessons there.
 
If you take this approach, you can list the mini lessons as topics on an easel pad sheet (or a slide page you can return to), and check them off as you go through them. Suppose each one takes 10 minutes (with questions); a dozen of them fill two hours, and that gives you a real framework for watching the time. Checking each one off will help your patient participants mete out their own attention. And if you do them in priority order, even if you have to adjust your time, you will know you have hit the most important things.
 
By the way, one of the best software manuals I have used, for the somewhat obscure Maptitude desktop GIS software, works really well in this mini-lesson format. Each new section starts with one or more 60-second tutorials. Sometimes that’s all you need, without reading the detailed reference.
 
If you have the luxury of two presenters, it really makes sense for one person to talk while the other works the screens. You can shift roles back and forth, but avoid having the same person do most of the talking, and avoid speaking and clicking at the same time.
 

Interactive Doesn't Have to Mean Hands-On 

Hands-on training is often a luxury trainers cannot afford: there isn't enough time on the schedule; the room is not set up for it; the participants are not ready yet for that. You can make your session highly interactive even without the hands-on, and be all the more successful.

How can you anticipate those inevitable things that don’t go right when presenting software details in front of a crowd? One idea is to put up easel pad pages with headings something like "suggestions for improvements" and "training follow-ups." Then you can note trouble spots or questions that mire things down, and just move on. By putting up the sheets in advance, you signal both openness to ideas and insistence that you have a schedule and not everything can get settled during the training.

It can help to give a long training pacing and rhythm. Don’t wait until the end for discussion phases. Here are three important back-and-forth elements you ought to cover, which you can use as natural change-of-pace topics within the overall outline.
 
First, address fears. Users of new software will bring fears and anxieties to the training. Those issues will distract them, so it’s much better to take them up along the way. Will the system increase my workload? Can I still get the same lists I used to get, however clunky the process?
 
Second, make time for follow-up. Staff who take a day out for training may have that much more work to come back to in the day or two afterwards. It really can make a difference in internalizing the learning to set aside time in advance to try out the system. It helps to give participants time to talk about their plans to put the training to use: maybe finding a learning buddy and making a date, thinking about which cases or clients to try it out on and what paperwork will be needed in advance to focus on using the new system, and other practical steps.
 
Third, address security and privacy concerns staff may have with the new system. If the new system is more comprehensive in what it collects and measures, best to talk through issues staff may have. 
 
One last audio-visual aid: have yet another easel pad sheet with top tips to remember. Even if you have them on slides, it can help to write them on a fresh list based on reactions in the room as you go through things. If the wording partly reflects comments and mood in the room as you do the training, it will make the tips more memorable. You can send them out as a follow-up.
 
I hope these points help. Unlike Andrea, I speak as someone who doesn’t do training every day. And I often have anxiety going into trainings, especially where it’s on software I have helped select or configure. Basic ideas like these help me get through, and I hope they do for you as well.
 

 

Out from Under Too Much Data

Shelley Podolny’s March 12 New York Times column, “The Digital Pileup,” started me thinking. “Because electronic information seems invisible, we underestimate the resources it takes to keep it all alive.” Podolny reports global data usage of 1.2 zettabytes (roughly 1.2 trillion gigabytes). For the US alone, 3% of the national power supply supports “server farms,” the giant data centers with aisles and aisles of servers.
 
Three percent may not seem like a lot. As the world reels from one natural disaster to another, at least some of which have to be connected to rising energy consumption, it adds up. On other occasions, I have commented on how the move to cloud computing—Salesforce, Google, and so on—reduces energy usage. It concentrates resources. Hmm. Maybe it only makes power demands more manageable against the rate of growth of data storage.
 
Podolny suggests that by 2020, “the [information] volume  will be 44 times greater than it was in 2009. There finally may be, in fact, T.M.I.” It’s that last comment that prompted this reflection. My job often involves helping organizations collect more data—facilitating better contact management and data tracking, enabling and encouraging richer web content, and so on. For sure, planning involves balancing need against capacity. Yet the main trend is more. 
 
In this context, Podolny says, “Despite the conveniences our online lives provide, we end up being buried by data at home and at work. An overabundance of data makes important things harder to find and impedes good decision-making. Efficiency withers as we struggle to find and manage the information we need to do our jobs. Estimates abound on how much productivity is lost because of information overload, but all of them are in the hundreds of millions of dollars yearly.” 
 
Even as organizations trend toward collecting more, better data, we need to regularly ask when we have too much.  Some thoughts.

Taking the Time to Focus on What We Need

First, data needs regular cleansing. And no one is immune. Just this week, we purged 30% of the contacts in a key internal system. They had been mingled in when we upgraded from an older system, marked as inactive and ignored. Yet however dormant those records were, gaps between them and current needs hindered synchronization with our time tracking system, adding extra monthly steps for a year.
 
When you bulk load old data into new systems, sometimes rules of data integrity and validation get bypassed to shoehorn old data in. You may use DemandTools, the Data Loader or other powerful data manipulation tools to get started with Salesforce. Yet you (or more likely we) may be tempted to leave some new data rules turned off until mounds of historical data find their new home.
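One low-tech way to avoid leaving those rules off for good is to pre-screen the legacy export before it ever reaches the loader. Here is a rough TypeScript sketch of the idea; the field names and rules are hypothetical, not anyone's actual schema.

```typescript
// Hypothetical shape of a legacy contact row; real exports have many more fields.
interface LegacyContact {
  lastName?: string;
  email?: string;
  lastActivity?: string; // e.g. "2009-04-01"
}

// Split a legacy export into rows that already meet the new system's rules
// and rows that need cleanup (or archiving) before import.
function screenForImport(rows: LegacyContact[], cutoffYear: number) {
  const ready: LegacyContact[] = [];
  const needsCleanup: LegacyContact[] = [];
  for (const row of rows) {
    const hasRequired = Boolean(row.lastName && row.email);
    const activeRecently =
      !!row.lastActivity && new Date(row.lastActivity).getFullYear() >= cutoffYear;
    (hasRequired && activeRecently ? ready : needsCleanup).push(row);
  }
  return { ready, needsCleanup };
}
```

The point is less the code than the workflow: only the "ready" pile goes through the loader with validation rules turned on; the rest gets cleaned or archived deliberately instead of dumped in.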
 
Lesson one: Data storage may cost less, but it still carries an organizational price. Better to invest the time up front, and then in periodic reviews, to clear out old data clutter.
 
Second, while some still wrestle with getting more news up on their website, too much content holds others back. It’s really the same story for content as for data. Has your migration to Drupal or another modern system gotten pushed back and back because of the weight of old site content? Less frequently used pages may still need tons of attention to realign them with new site navigation and search tools.

Some possible lessons here about large web projects: Focus first on the 30, 40, 50 pages, or whatever the number, that get used the most. Those probably need to be rewritten anyhow. Leave the 80-90% of the pages that get used only 10 or 20% of the time in their old format until they can be pared down, reindexed if still needed, or archived into some other format.

Another lesson: If you see a new website coming, make sure now that you have good, usable analytics on the current site. When it comes time to begin active planning, you want to know which pages and resources get used the most. You also want to know which things visitors search for the most that may exist but don’t get accessed, buried under the dead weight of the total information past.
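Any analytics export that gives you page paths and view counts is enough for that kind of triage. Here is a small TypeScript sketch of the arithmetic; the data shape is assumed rather than tied to a particular analytics product.

```typescript
interface PageStat {
  path: string;
  views: number;
}

// Rank pages by views and report what share of total traffic the top N carry.
function topPages(stats: PageStat[], n = 50) {
  const total = stats.reduce((sum, p) => sum + p.views, 0);
  const top = [...stats].sort((a, b) => b.views - a.views).slice(0, n);
  const topViews = top.reduce((sum, p) => sum + p.views, 0);
  return { top, shareOfTraffic: total > 0 ? topViews / total : 0 };
}
```

If the top 50 pages turn out to carry, say, 85% of the traffic, you have a defensible case for leaving the long tail of pages out of the first migration pass.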

Even Good Back-ups Multiply Data Overload

Third, at a time when we need to pay more attention to back-up policies, we can’t neglect duplicative and redundant data storage. We understand this better with social media. Once a photo posts, copies may proliferate very quickly, both from users and from the systems they sit on. Podolny estimates that 70% of information storage today is generated by individual use, and that 75% of all data storage is duplicative.
 
Lessons here: We need to make it easier for staff and constituents to forward links to content and use it on line, not just attach full document copies to group emails. As for formal back-up, the storage for a few hundred web pages, documents and multimedia needs to be multiplied by how many back-up copies exist. A “good” back-up policy may mean multiple copies in many places. Organizational email may also be backed up multiple times, consuming gigabytes of storage. The lesson here is: make sure you do have redundant back-ups, yet also make sure that they haven’t grown so large and unwieldy as to be impractical when you need them.
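The multiplication is worth doing explicitly, even on the back of an envelope. A toy sketch with made-up numbers:

```typescript
// Total storage consumed once every backup location and retained version counts.
function backupFootprintGb(liveGb: number, locations: number, retainedVersions: number): number {
  return liveGb * locations * retainedVersions;
}

// 25 GB of pages, documents and media, copied to 3 locations,
// each keeping 4 rotating versions, already occupies 300 GB.
console.log(backupFootprintGb(25, 3, 4)); // 300
```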
 
Fourth, with data back-up especially, we need to keep track of our confidential information. A client finally gave up on using social security numbers as a convenient contact search index. Removing the data point from active use is one thing. Now comes the hard part: finding all those back-ups, including ones in inconvenient archival formats, that still pose a data security risk.
 
As Podolny points out, “In the corporate realm, companies stockpile data because keeping it seems easier than figuring out what they can delete. This behavior has hidden costs and creates risks of security and privacy breaches as data goes rogue.” Data security laws, such as the new ones in Massachusetts, apply equally to nonprofit organizations as to private businesses. 
 
Podolny’s call to action makes sense: “We can live a productive digital life without hoarding information. As stockholders and consumers, we can demand that our companies and service providers aggressively engage in data-reduction strategies. We can clean up the stockpiles of dead data that live around us, be wiser data consumers, text less and talk more. We can try hitting delete more often.”
 
The overall message: whether for the environment or our sanity, even though technology is giving us “more” digital data, less may be more.
 
 

 

Cloud Security in the Era of WikiLeaks

Salesforce helped pioneer the concept of putting confidential organizational data in a "public cloud" system. Other key vendors offering public cloud data services include Microsoft Azure and Amazon S3. "Cloud" has come to mean many things to many people as far as putting internal office functions up on the Internet. The word “public” is important to understand. It means that all data--every corporate and nonprofit user's--sits in one enormous database. This is in fact a reassurance, not a drawback. In a public cloud web service, the vendor focuses its security attentions on one system, top to bottom.

 

Public Clouds and the Apartment Building Analogy

 
The data system in a public cloud is "multi-tenant"--tenant as in a big apartment building. Even more than a condo development, an apartment building provides uniform infrastructure and support distributed across all residents. Tenants in a well-run building can still personalize a unit yet off-load almost all concerns for maintenance and infrastructure. Salesforce, for example, manages its 100,000 customer accounts with something like 20,000 servers, spread across multiple data centers worldwide, each with thousands of servers. A typical hosted website may be secured by the Internet Service Provider, yet it resides in one physical place at a time. In a public cloud system, your data is never just on one server in one data center. There is immense redundancy and optimization, so you never need to think about where things are stored and where the next page view is coming from. Likewise, you don't have to think about doing a back-up. It just happens.
 
While it still strikes many as challenging to use a public cloud service for storing confidential donor or client lists, we all trust public cloud environments daily. When you use GMail, Google Docs, Facebook, Flickr, or Twitter, you trust your data to a public cloud. In each of these, your data, personal or organizational, sits with everyone else’s in one place. For that matter, specialized email manager services like Constant Contact, on-line donation systems, and BaseCamp are also public clouds of sorts. Though you only see your own projects, Basecamp or Central Desktop project information sits in the same database as that of thousands of other customers.
 
In a cloud environment, everyone logs into one place. Once you log in, you are directed into a compartmentalized set of functions and data and can never see anyone else's.  The privacy is covered by the user name (email address used to log in) and password. It is up to the vendor to make sure other people’s data doesn’t leak through to your account. What they gain, again, is the ability to manage all security and performance issues in one place. 
 
As with modern network managers, cloud systems typically enable an organizational systems administrator to further govern the user list: you can restrict usage by location (IP address), time of day, and other rules. Cloud environments should make these rules--as well as the log-in history of each user--visible to the system administrator(s) of the account (including you) at all times. The systems also encourage robust passwords and resetting them periodically.
 
When someone leaves the company or organization, the administrator should be able to simply deactivate their user name. This immediately locks them out. Deactivated users in some systems can remain in your account without counting against licensing. This means that records of work and activity remain even after the user is locked out of the system. A bulk transfer of record ownership from a user who has left to a new user is a handy feature to use once the user is deactivated.
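For Salesforce in particular, off-boarding like this can be scripted against the API as well as done through the admin screens. Here is a hedged sketch using the open source jsforce library; the object and field names (User.IsActive, Contact.OwnerId) follow Salesforce's standard schema, and the connection setup is sketched after the next paragraph.

```typescript
import * as jsforce from "jsforce";

// Deactivate a departed user, then hand their contacts to a successor.
// Assumes `conn` is an already authenticated jsforce connection.
async function offboardUser(
  conn: jsforce.Connection,
  oldUserId: string,
  newOwnerId: string
): Promise<number> {
  // Lock the departed user out without deleting their history.
  await conn.sobject("User").update({ Id: oldUserId, IsActive: false });

  // Reassign record ownership in bulk (Contacts shown; repeat for other objects).
  const contacts = await conn.sobject("Contact").find({ OwnerId: oldUserId }, ["Id"]);
  const updates = contacts.map((c: any) => ({ Id: c.Id, OwnerId: newOwnerId }));
  if (updates.length > 0) {
    await conn.sobject("Contact").update(updates);
  }
  return updates.length;
}
```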
 
Add-on tools to cloud services also need to be secure. Sometimes people ask me if they can “see” their cloud data directly using ODBC, as they might have done with MySQL or SQL Server based traditional websites.  In order to hold things together securely, public cloud services typically ensure that add-ons for email, event registration, as well as your own custom pages use an additional layer of security beyond user name and password. Salesforce, for example, adds a long, apparently random "security token" to encrypted data transfers by add-on tools or pages. 
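In practice, that means an add-on tool or script logs in with the user's password plus the security token appended, all over an encrypted connection. A minimal sketch, again with jsforce; the credentials shown are placeholders.

```typescript
import * as jsforce from "jsforce";

// Authenticate the way an add-on tool typically does: password + security token.
async function connectToSalesforce(
  username: string,
  password: string,
  securityToken: string
): Promise<jsforce.Connection> {
  const conn = new jsforce.Connection({ loginUrl: "https://login.salesforce.com" });
  await conn.login(username, password + securityToken);
  return conn;
}

// const conn = await connectToSalesforce("user@example.org", "secret", "token");
// ...then pass `conn` to functions like offboardUser() above.
```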
 

How much security responsibility do you want to have in the era of WikiLeaks?

 
Another element of cloud-system security has to do with browser standardization. When you create or update your public website, you can still opt to have the developer support older, insecure browsers like Internet Explorer 6. The developer won’t be happy, but it can be done. Cloud systems typically set standards for secure browser connections, which may frustrate users on older computers that haven’t updated, yet it adds protection and ensures encryption of the organizational data.
 
Cloud environments also enable centralized treatment. Every week there are new threats to computer systems. These threats could potentially affect cloud systems, but the difference is that it is the cloud vendor’s job to make the patches, and when they do, they do it in one place for all customers. Everywhere else, you have to have some concern about when patches will be applied, who will do it, and whether it will affect your customizations. Cloud vendors like Salesforce focus on pre-announced major upgrade cycles each year and roll them out in organized fashion, while making security patches regularly.
 
A different security issue concerns how much we can trust a cloud vendor to maintain the confidentiality and privacy of the data we entrust to them. This is a consideration with any software any of us uses these days. We live in an era of WikiLeaks, apparent semi-official Chinese theft of Google and Adobe data, Israeli attacks on Iranian nuclear power system networks, and newly intrusive data mining by US authorities. Yes, you do have to pick whom you can trust. An organization counseling undocumented immigrants, pregnant teens, or handling anything else confidential needs to consider what its network provider, hosting company or cloud service would do in the event of a government warrant or a determined attack by a political or organizational opponent. In a cloud environment, you have to select vendors you can trust based on their size, history with privacy incidents, and leadership and board commitments. Given that we expect our desktop or network server software to “check for security updates” regularly, to be fair, we need to make the same determination for all software, not just cloud systems.
 

Private Clouds

 
In general, using a public cloud environment is like using the electrical or water utility: each month, you use as much or as little as you need, and you pay as you go. Technology managers who speak in terms of "Enterprise" class services also, among other things, seek environments where your software applications are always available,  and the infrastructure can handle any size load reliably.  The various forms of cloud technology make this possible and the birth of public cloud providers makes this economically feasible for small organizations.
 
As I mentioned, cloud can mean different things. These days, everyone wants to say they have a "cloud" offering. We also use, and have created, web applications using Drupal or .Net that might be called "private cloud" systems. A private cloud does give an organization more unilateral control over its data, yet it also leaves more responsibility in the hands of the developer or support provider. You may use a shared web infrastructure that off-loads some responsibilities, while leaving issues of integrity, up-time, and some of the security features dependent on the developer.
 
Web sites may be hosted on a server that has dozens or more separate sites. They share some things on the physical server and its surrounding network at your Internet Service Provider. They use virtual environments to confine your view to what looks like an entire isolated web server. If you do something bad, your site will go down and no one else’s. While that sounds good, it also means you have more responsibility to make sure that your site doesn't do anything "bad." It is a mixed blessing, and as software gets more complex, a growing burden.
 

What about traditional networked databases and software?

A private cloud is more reliable than an internal networked database. A networked database (e.g., Raiser's Edge or Sage) running on a Windows server leaves you with complete responsibility for all aspects of the integrity, security, back-up, updates and so on of the software. Larger organizations feel comfortable doing this and they pay for it, either to their network admin team, to a software vendor, or to a consultant. Smaller organizations cut corners and may be vulnerable. Yes, you can isolate things by virtue of your network firewall and server security software. One might debate in 2011 whether this is more secure than cloud environments. On the whole, I no longer think it is. Network administrators and server software vendors are fighting at best a stalemate keeping malware outside the door. Malware on an individual, inadequately protected computer on an organizational network can undermine the entire network.

 
As far as the reliability and responsiveness of the database--a different kind of security: in a public cloud database or network environment, everyone uses the same underlying data store, which means that a small nonprofit with 10-20 users and records in the tens of thousands gets the same treatment as a giant entity with hundreds of users and millions of records. In an environment you maintain yourself, you have to be concerned with whether the set-up of the database (including Access or Microsoft SQL Server) can support the amount of data and number of users. Anticipating growth and scalability in a private or networked system is a big headache.
 
Organizations that host their own software applications internally have a basic conundrum. The cost of network hardware has dropped considerably. Standard networking software can be either free (Linux) or, for nonprofits, subsidized (Microsoft on TechSoup). Yet those reduced base costs mask a continuing high cost of maintaining a server year by year.
 
The conundrum comes from bearing the entire cost of maintaining that server yet most likely hardly ever using its entire capacity. Until a few years ago, organizations found themselves adding multiple servers to separate file and print services from email management, application and database servers and such. Virtualization technology, a fundamental technology for cloud computing, has allowed the consolidation of servers. This provides noticeable economic savings as well as environmental benefits. The simplest analogy is switching from incandescent bulbs to compact fluorescents or LEDs. This partly mitigates the responsibilities for maintaining local servers, but only partly.
 

Broadening Safety in 2011

Along with the physical security of a local server are issues of data security. Data security is multi-layered and multifaceted. Here in Massachusetts, the state enacted laws to protect personally identifiable information. This changes the security model considerably because organizations, including nonprofits, now need to proactively take responsibility for protecting against inappropriate internal data access. Sensitive data now needs to be encrypted in the database and users granted specific permission to access that data. If your organization must provide a security compliance audit, some say that public clouds with their proprietary technology may not pass scrutiny, as they don't reveal their internal workings. At least for now, this is a major rationale in the corporate world for “private cloud” environments, where you get some of the benefits of hosted cloud infrastructure yet take responsibility for everything you put up there. On the other hand, organizations with their own local servers may lag behind in documenting and keeping up to date a security compliance plan for their network and server infrastructure.
 
Below private clouds and networked systems lies the security of individual computers and now mobile devices. Here the security picture is even bleaker, with inadequate protection and frequent vulnerabilities. When someone says they will install their accounting system or other critical data on just one computer and keep it off the network to be safe, you can ask: how much effort are they putting into back-up, anti-virus, hardware maintenance and all the rest for that one computer? How much does a systems administrator at that organization monitor the complete set-up of that confidential computer?
 
For a sense of emerging security issues we will likely face in 2011, check out this article from the great technical security resource, http://darkreading.com. 
 
From my point of view, 2011 looks to be the year that, no matter your application platform, there is an affordable, secure cloud solution for you. Whether you start with simple server virtualization or move to a more complex cloud solution, it’s time to do something about that server in the closet. From a security point of view, the old internal server looks less like a private fortress than a single point of failure. Some would call it a “stress box.” The burdens of maintaining private systems have become so great that developers and software publishers feel increasing pressure to move in this direction in order to survive and give their customers good service. Infrastructure specialization is a good thing: it enables organizations to focus on their core mission, reduce operating costs, handle unexpected contingencies, and contribute to a greener world. My sense is that over the next several years, more and more corporate systems will move to the cloud, and to the public cloud, and in their wake, so will nonprofit data.
 
 
 

 

A fresh look at Dragon voice recognition software

 

Something different about this blog entry than my usual ones is that I'm dictating it instead of typing it. Every so often my mouse hand and wrist bother me for a week or two. The last time this happened, I decided to get a copy of Dragon NaturallySpeaking software and try it out. Coincidentally, a couple weeks ago, in the middle of all this miserable snow we’re having in the Northeast, a friend and colleague tore a tendon in his elbow. I convinced him to try out the software. He needs to write all the time, and he has been using it almost exclusively and successfully. This in turn got me using it again. I have to say--it works.
 
Installing it was much easier than I thought. After running a standard installer, I went through a pretty easy and intuitive start-up process. I spent about ten minutes reading in a few sample passages to let the software get used to my voice.
 
 
And then I just got started. The software pretty much keeps up with me as I speak, and it is indeed very accurate. I can intersperse my dictation with intuitive voice commands, such as “comma,” “period,” “scratch that,” “new paragraph,” “bold this,” and so on. The commands are intuitive, and there is a pop-up tip guide.
 
There are some things to get used to. I had to fight the tendency to say one word at a time and wait for it to show up on the screen. Instead, the software listens for the phrase or sentence, analyzes it and puts the whole thing on the screen. I found that part pretty slick: because in the start-up process you let the software analyze all your documents mathematically for frequently used words and phrases, it can process dictation in whole phrases and sentences. It not only keeps up better, it also is less likely to mix up words in English that sound alike but have different spellings and meanings. Going too slowly and staring at the screen not only distracts from the writing, but also undermines the accuracy of the software. Trusting the software takes some getting used to. Once you do, the software is useful and even fun.
 
The big pluses are, first, it's hands-free, which means that you can give your wrists and arms a rest, and sooner or later we all need to do that. If you’re not the greatest typist, it's definitely got to be faster than typing. Though I can type pretty fast, my friend with the tendon injury says that he is now writing a lot quicker than when he was typing, and with fewer typos. The manufacturer (Nuance) claims close to 100% accuracy, and that has been my impression.
 
Drawbacks are that there are a lot of different versions and it's not the cheapest. Windows 7 has built-in voice recognition for dictation, and it apparently works for basic stuff. I suspect that you get what you pay for. Dragon has been around a really long time, and Nuance has put a lot of effort into optimizing this product. If you shop around, you can get the home edition for well under $100. As with other things, the Professional Edition adds more features. And for such things as the esteemed right to use a Bluetooth wireless headset, you will surely pay a big premium. But for your work, you may be fine with the basic version. And you certainly do not need a really fancy mic. I’m using it tonight with my standard $30-$40 Plantronics USB DSP400 headset, nothing exotic or special -- the same one I would use for Skype. You can look at the version comparisons on the Nuance site, http://www.nuance.com/, and then shop around.
 
Another possible drawback is that, even though you really don’t have to go slow, you do have to speak precisely. You have to pay special attention to starting and ending sentences. And if you work in a busy office and your writing is subject to interruption, you have to be ready to give the voice command to stop the dictation if someone comes over and asks you a question or you answer the phone. That can take some getting used to. These things mean that, at least in the beginning, more casual writing, such as answering emails, may work more “naturally” for you than writing something more formal. But that's about it.
 
Also, you need to be aware of the software's resource intensiveness. When NaturallySpeaking checks your computer initially, it lets you know how ready your system is to keep up. My laptop has 4 GB of RAM and it clearly needs it. In order to do all this complex recognition, the software uses up resources even faster than a video or graphics editing package.
 
Still, in sum, for either your health or to speed up your typing, Dragon could be a great new utility. I’m going to keep using it.
 
 

 

"Minority Report" and the Future of the News

 My New Year’s reflection on what comes next included watching the 2002 movie Minority Report again. I found myself wondering why this ten year old sci-fi thriller gets shown so much on TV. The once very cool crime computer gadgetry now seems all but mainstream. I suspect that instead, the movie resonates with things we see coming as far as the future of the “news.” There are things we partly want as far as social media, and partly fear as far as corporate and government information gathering and management. Though it doesn’t picture the world of today, Minority Report anticipates some of the dilemmas of today. If you are trying to imagine where the information empires represented by Google, Facebook, along with government intelligence gathering and more are heading, you could do worse than watch the film. Then try to balance its vision against what the rest of us ordinary bloggers, policy advocates, and more will be able to do.

As far as the technology in the movie, in ten years the computer interface in Steven Spielberg’s film has become commonplace. Back then, watching Tom Cruise wave his hands around to zoom in, zoom out and otherwise manipulate data was all new. Now we see such things all day long on CNN. Even local weather reports now have lots of hand-waving at computer screens. And we also have multi-touch cell phones, iPads and more. And Microsoft’s Xbox Kinect gets us even closer. The computer stuff in Minority Report starts to look quaint, particularly with the heavy classical music background.
 
Instead, it is Minority Report’s concept of “Precrime” that strikes a raw nerve. Computers barely mattered in everyday life when Philip K. Dick wrote the story. When Spielberg made the movie, Google didn’t yet stand out and Facebook and Twitter didn’t exist. Yet Minority Report featured a total information network for crime news. Tom Cruise / John Anderton’s police Precrime unit uses cutting-edge technology to see and respond to violent crime before it happens, reducing the murder rate to zero. The movie features eerie media coverage of crimes that didn’t happen, as opposed to covering them after the fact.
 
Minority Report’s Precrime ultimately relies on three genetically engineered human psychics to report future news. Substitute social media for Minority Report’s team of psychics and you have something like what is happening today. Over the next year or so, we will see the growing contention between, and perhaps convergence of, two major computing trends—search and social media. Google does something like a billion searches a day, and Facebook has something like half a billion members. What we collectively search for affects what is the news. And all those tweets and Facebook “likes” all over the traditional media sites also affect what is the news today. We can’t yet literally see what will happen an hour from now, but we can have a pretty good idea of what will be the “news” an hour from now. In watching the psychics at work, and especially as one emerges with her own personality, you feel a bit of the generalized, aggregated effect of social media.
 
For communications staff and on-line advocacy organizers, these tools represent an opportunity. Key technologies are democratized--free services or open source tools. Almost any sized organization can put effort into search engine optimization or into trending messages on social media to be more part of the news. 
 
Yet in controlling the message, we play against a stacked deck where corporate communications staff have far more resources, even if they don’t have genetically engineered psychics.
 
And the media empires offer these tools “free” in order to collect ever more information about us for their advertisers. That’s the other prescient element of Minority Report. It visualizes a world where we are surrounded by deeply personalized advertising. Corporate entities in the film have totalized databases of people and their histories. Computer systems scan your eyes, index them into a database, and use them to deliver the right advertising messaging. Watch the mall scene here: http://www.youtube.com/watch?v=oBaiKsYUdvg. The idea that corporations and the government have systems that know more about us than is realistically humanly possible is kind of scary.
 
This is the trajectory of Google, page visit tracking, and all the rest. We can read free services’ privacy policies (here’s the one for Chrome: http://www.google.com/chrome/intl/en/privacy.html). Most people don’t read them, and even if you do, do you really know how much privacy you have?
 
If we needed a reminder, we now know from WikiLeaks that nothing today can be kept truly private or confidential. And even as everyone worried about airport scanning this fall, the real trend in domestic security is going to be more government information gathering, analysis and profiling. I would say that what keeps Minority Report current is our wondering where all this corporate and government information gathering will end and how much our own efforts to control the message can balance it.
 
So, reflection leads to my resolution for 2011: do all we can to balance the oppressive, subliminally distorting, privacy-stealing control side of communication technology with greater efforts to keep usefully personalized tools working for the social good. 
  

Round X: Do you or does Facebook own your Facebook data?

More and more web sites allow you to sign in with your Facebook ID. Not a good idea if you are just doing it to avoid remembering passwords. Better to sign up for lastpass.com or something like it. It is an interesting idea if you truly want to share what you do on the other site with your Facebook circle.

In the be-careful-what-you-ask-for department, issues of privacy, openness of data exchange and reciprocity among web services continue to stick to Facebook. Michael Arrington’s 11/9 post on Techcrunch.com, the major watcher of all things high tech and start-up, takes a fairly strident stand against Facebook’s current policies.

Facebook has become not just another social network but an institution of our times, complete with a Hollywood version.  As it grows, its economic power grows, a lot by gobbling up ever larger mounds of personal data and web browsing habits.  When you log in to another site using your Facebook ID, a new stream of data flows outward from that site to Facebook. Behind the scenes, your data on Facebook cooks with all your other data to produce ever more complete portraits of your habits and the social cohorts you are assigned to. This gives Facebook more marketing power to advertisers and marketers.
 
Facebook’s lack of reciprocity about data bothers a lot of people. It bothers Google and other large presences on the web. It bothers media and technology critics. It bothers Facebook users even if they don’t quite make the leap from technological limitation to corporate policy. 
 
The latest conflict, between Google and Facebook over user identities, is illuminating. Google announced a policy that it would only allow other web services to grab user lists if those web services provided the same service back. This effectively blocked Facebook. Facebook retaliated by publishing the manual steps to work around the restriction. And the war of words heightened.
 
Facebook’s policy hinges on the idea that you should be able to get at your data, but not data about your friends. So if you are a friend on Facebook, you may export that bit of information, but not other stuff—notably not the email addresses of your friends. Reasonable in some ways, but in practice, users become tied to Facebook. This is true for individuals, for organizations and advocacy causes and for businesses.  There are ways around everything, but not on an easy, free or grand scale.
 
Bottom line: While Facebook claims you have control over your own data, control is not ownership. You can only do as much as their tools allow. And their competitive market position governs what they allow more than deep and abiding concern for their users’ long term independence and privacy.  
Arrington argues that Facebook needs to give individuals and organizations greater ability to extract their data. Likewise, he says that other web services need greater ability to draw on a user’s Facebook social graph, the sum total of their relationships.
 
Read the whole post, but his summary makes the basic points: 
1. It’s what users want, and it’s the right thing to do.
2. Facebook is so large now that health-of-ecosystem and user needs must be considered when Facebook makes product and policy decisions.
3. They’re lying to press and users, even today, about their motivations for retaining data. This is not about protecting users.
4. The data export tool they released last month is a red herring.
5. They have a very small window of opportunity to do this, before Attorneys General and class action litigators see too big of an opportunity to pass up.
 
Whatever you think about the substance of the issues, anyone who works with technology has some social responsibility to break these issues down for those we work with and influence, and to speak up on the consumer as well as the potential regulatory issues involved.
 

Will the Revolution Be Tweeted? Striking the Right Balance with Social Media

I just took in The Social Network movie and Malcolm Gladwell’s October 4 New Yorker article, "Small Change: Why the Revolution Will not be Tweeted" for a double feature  of social media caution. 

 
The Social Network was definitely engaging. But if you didn’t know it before, Frank Rich nailed it: “You leave the movie with the sinking feeling that the democratic utopia breathlessly promised by Facebook and its Web brethren is already gone with the wind” (“Facebook Politicians Are Not Your Friends,” 10/9).
 
And Gladwell wasn’t much fun either. He contrasts the traditional grass roots organizing of the early Civil Rights era with today’s Internet-charged activism.  He praises the early 60s sit-ins as reflecting seriousness, fearlessness, commitment and tighter organization. There is a lot to learn from that era. You can go back to first-hand accounts like the late Howard Zinn’s 1964  SNCC: The New Abolitionists or check out  many more recent studies. 
 
My own sense is that the Student Nonviolent Coordinating Committee and other initiatives had more of a mix of organization and creative spontaneity than Gladwell allows. By painting an over-simplified picture of the past, Gladwell tends to be overly dismissive of how we use on-line  services today. 
 
Among his other targets, Gladwell takes on Clay Shirky (Here Comes Everybody). Gladwell chastises Shirky for shifting from “organizations that promote strategic and disciplined activity and toward those which promote resilience and adaptability.” Well, maybe, to some extent. Endless cause-based activism can just lead to … more cause-based activity. (For more on this topic, check out The Revolution Will Not Be Funded: Beyond the Non-Profit Industrial Complex, by INCITE! Women of Color Against Violence.)
 
Yet as so many nonprofits and advocacy groups have seen, the social media offer useful, low cost, low risk tools for organizing and activism. Yes, they cannot substitute for long-term planning, formulation of strategy, and organizational strength.  Shirky doesn’t say that the social media are those things. I love his theme that the new communication tools reduce the friction of coordinating social change. They reduce the barrier of getting started, of experimenting and of failure. That is, a small—or large—entity can formulate a plan and try it out quickly. Today’s communication tools—free or low cost, Open Source, accessible to use—make a huge difference. Without substituting for organization, they enable those of limited organizational means to be lighter on their feet.  
 
I’m sure that the advent of the printing press, transcontinental rail, telegraph or telephone had little to do with encouraging dissent or advocacy and everything to do with expanding business. Yet in their own way and in their day, they certainly also reduced the cost of coordinating the grass roots at a distance. Where Gladwell emphasizes the problems of weak ties and limited commitments, we can also see the ability of more people to make a start and try things out. What happens after that is where learning, strategy, and structure come in. 
 
Among the many responses to Gladwell, I found much to like in Jeremy Brecher and Brendan Smith on Common Dreams. They take on Gladwell’s discussion of strong versus weak ties, hierarchy versus networks: 
 
"As Gladwell indicates, ten thousand people sending each other tweets doth not a revolution make, or even major social change. Whatever else, significant social change requires, as Gandhi put it, "noncooperation" with the status quo and a "matching of forces" with those who would maintain it. Social networking cannot in itself provide either of these. But it can be a powerful tool for making such expressions of power possible. "
 
If the social media offer free or low cost activist tools, nothing stops those with financial resources from entering the game at a higher level. In this year’s election especially, we can see the effects of corporate and well-financed Tea Party efforts at work in the new media. This brings me back to Frank Rich’s NYT 10/9 column I mentioned earlier. Rich is unfortunately right that “The Internet in general and social networking in particular have done little, if anything, to hobble those pursuing power with such traditional means as big lies and big money.” In other words, if the social media are accessible adjuncts to grass roots activism, they also can be subject to mastery by well-funded conservative defenders of the status quo. Rich directs his wonderful ire at those who would hide behind social media efforts to avoid the public or distort public discourse.
 
What’s coming very quickly are sophisticated tools intended for corporations to monitor and attempt to out-organize social media critics. Salesforce, for example, already offers tremendous tools for monitoring Facebook or Twitter. Absolutely, we see nonprofits evaluating these tools as well, but there is no question that large corporations will use them more thoroughly.
 
Overall, social media are here to stay as communication tools. If Gladwell’s article helps keep people from having illusions about how far they can reach on their own, great, but let’s get on with making them work for advocacy and the grass roots.
 

Why Net Neutrality Matters

How good is your Internet access and how much does it matter? How much does improving your Internet access matter for you at home, for your organization at work, and for the constituents and communities you work with? You may be thinking, I finally have broadband (high speed access) at home, at work, and at my kids’ schools, so yeah, there’s something there, but it's not the biggest issue. Trying to explain this lately, I have been looking at it as the overlap of three swirling issues—broadband speed and access, net neutrality, and the digital divide.

Broadband Access in the United States

 

I previously posted information on how you can test your Internet connectivity, and thereby contribute to an overall national survey organized by the FCC: http://idealware.org/blog/test-your-internet-speed. Given that I’m still waiting to be formally added to the study, let me suggest a quicker, easier way to test your speed: http://www.speakeasy.net/speedtest/ or http://www.whatismyip.com/tools/internet-speed-test.asp. Note that many sites on the Internet that let you test your access speed try to sell you this or that or install stuff on your desktop. If you are curious, try to just do the test!

In addition to the FCC’s own studies, the Berkman Center for Internet and Society at Harvard Law School also undertook its own grant-funded independent research to advise the FCC.  Read more at http://cyber.law.harvard.edu/research/broadband_review.
 
This research confirms what most of us know: even if you have broadband where you need it, it’s likely far from perfect. According to the Berkman Center, we in the United States have slower and more costly Internet service than many other industrialized countries. And that’s not even looking at customer service issues. At the end of the twentieth century, the United States stood at the top of the heap on most measures of high speed Internet access and performance. It seemed easy to imagine that corporations in the United States would use that edge to their competitive advantage. Ten years later, we have slipped to the middle of the thirty countries (members of the OECD, http://www.oecd.org) the Berkman Center studied. Other studies put the US in the bottom third.
 
In the Berkman study, the US ranked 15th out of 30 in high speed access to the Internet. While US prices for lower speed access ranked only ninth, the US ranked 18th-19th for the cost of higher speed access. The downward trend followed directly from (surprise, surprise) Bush-era FCC deregulation. In 2001 and 2002, the FCC decided that the Internet was about content and not telecommunications. The FCC regulates telecommunications (notably, phone service) and not information. These decisions effectively bypassed earlier Clinton-era legislation which had aimed to open up Internet access. The results have been less competition and a steady slippage. 
 
At the margins, we see in our own work that more costly and less reliable access to broadband does affect decision-making about software and new program initiatives. Not every time, for sure, but enough to be a visible drag on strategic planning. Businesses and government agencies as well as nonprofits will tend to lag behind in adopting newer cloud-based web data systems to manage their work. Staff may even prefer really old software because it remains perky on slow connections, even if that means ten extra steps to cross-reference different types of information or send simple follow-up emails. It means that organizations will spend extra on duplicative back-up data repositories because they remain anxious that web access will slow or break down during critical program or organizing periods. And for both global corporations and small nonprofits, it may mean postponing collaborative initiatives based on exchange of data and web services, or making them more complex.
 
It has become a truism that the Internet is as critical a part of the national and global infrastructure as traditional transportation and communication. Yet when President Obama recently announced new legislation for a $50 billion public works project, the presumption was that it would provide jobs and improvements in those traditional areas. We have a ways to go before public works will mean public Internet works.
 

Net Neutrality and Organizational Strategy

 
Broadband access in turn overlaps with a second issue, net neutrality. Like broadband access, it’s worth understanding how net neutrality affects work and planning. Net neutrality means that the Internet works like a roadway, the post office, or phone service: anyone can use them at the same prices and under the same contracts, and where they are regulated, the regulations apply fairly and evenly. OK, so postal regulations subsidize certain kinds of mailings, including by nonprofits on the one hand and glossy catalog mailers on the other. Yet, so far as I know, there is no way for LL Bean to get different delivery guarantees or rates from what Lands End gets. We all take postal service neutrality for granted. 
 
When it comes to the Internet, many factors are at work limiting net neutrality in the United States. After several decades of the Internet, we may be coming to a crossroads where the companies that provide the infrastructure for transmission no longer offer the same services and pricing to all publishers of information. The opposite of net neutrality would mean that Verizon, say, could charge Netflix a surcharge to deliver multimedia at faster rates than consumers download from the Apple store. Or that consumers would pay more for access to services that haven’t partnered with their Internet provider. Sounds like chaos? You bet. Over the summer, Verizon and “Don’t be Evil” Google surprised a lot of people by floating just that sort of idea. They then amended it to say, OK, net neutrality over everything wired (cable and phone lines), but allow companies to make whatever deals they want for delivering content over wireless connections. 
 
Along with long-time Internet advocacy groups such as freepress.net (http://www.freepress.net/media_issues/internet), the Nonprofit Technology Network jumped in on this issue and urged public action in support of net neutrality: http://www.nten.org/blog/2010/08/12/net-neutrality-update-googleverizon-proposal
Check out that site (associated with freepress.net) for a clear YouTube video for teaching people about net neutrality.
 
 
 
Here too, we may be relieved simply to know that, compared to ten years ago, we have access and things tend to get through, and so turn to more pressing issues. But this matters, or will matter, in a lot of ways, even for small organizations. Heading toward a place where broadband access in the United States expands more through private corporate deals than through general pressure or regulation will mean that the US will likely continue to fall behind the rest of the world in communications. That is, we don’t want eroding net neutrality to become the cable, DSL, and wireless providers’ response to relative slippage in service. 
 
More directly, if net neutrality goes, it is easy to imagine a transformation where MSNBC, Fox, and CNN get to deliver multimedia content about news events at different speeds depending on the content of the message. Eroding net neutrality could mean that Verizon or Comcast could deliver email more slowly from labor organizations or advocacy groups fund-raising to constituencies or otherwise organizing against their policies. Much as I would love to see Fox News go away, it would definitely make me feel queasy if, say, Comcast decided to slow down their multi-media feeds because of political content. 
 
Over time, allowing such discrimination could mean that organizations with fewer resources and more cutting-edge messages would tend to be less likely to use multi-media in their work than corporations with the budgets to guarantee delivery of their content. The Internet as a communications equalizer for nonprofit and other independent public policy advocacy will erode. I am not saying these things will change overnight. It is more the marginal trend that will affect careful budgeting of resources.
 

Net Neutrality, Broadband Access and the Digital Divide Today

 

Both these issues about the future of the Internet express one part of the challenge around a perhaps more familiar long-term issue, the digital divide. Most of us instinctively consider the impact of the digital divide in technology planning. If we launch a new web site, will all our communities have access to it? Two resources: a solid, contemporary presentation of the digital divide today, http://www.internetforeveryone.org/, and the home of the revamped digital divide network listserv, http://forums.e-democracy.org/groups/inclusion
 
The digital divide, when first identified as such twenty years ago, referred more to the high cost of computers than to access to the Internet. It had particular immediacy between richer and poorer countries and regions globally. Computers, and the attendant infrastructure of reliable electricity, access to software, and such, still represent a daunting challenge in many parts of the world. Yet hardware has become less of a limiting factor. We have initiatives like One Laptop Per Child as well as the general cheapening of computers. We also have the growing prevalence of Internet-capable phones. And open source alternatives to proprietary desktop applications have reduced that barrier as well.  
 
Emerging in its place is the sense that the digital divide overlaps with reliable, inexpensive access to Internet-based services. For organizations planning their technology strategy, it has become more typical to ask whether constituents have daily or regular access to email, the web, and SMS text messaging than whether they have “a computer at home.” This too emphasizes the original point: we all have a significant stake in the debates over net neutrality and effective broadband policy, here in the United States and globally. 
 

Professional Slide Presentations for Regular People

I read Garr Reynolds’ Presentation Zen (see his blog, http://www.presentationzen.com/) while getting a workshop ready for last spring’s NTC (http://www.nten.org/ntc). Read Presentation Zen when you need inspiration in focusing, simplifying, and designing your slide presentation so your ideas shine through.


Microsoft Office PowerPoint has become the pervasive means of organizing group presentations: not just the software, but also the method of speaking and organizing. Yes, I also use OpenOffice.org Impress (http://openoffice.org) and Google Docs Presentations. For preparing a set of slides quickly, those will do quite well (as will Apple Keynote). I especially want to say that the OpenOffice 3 applications now all have a solid finish to them, whether on a PC, Mac, or Linux computer. Yet like other Microsoft Office products, PowerPoint shines partly based on features, and partly based on high quality fonts, graphics, style sheets, and full-bore templates. The online MS Office library has good, usable graphics, templates (color, layout, and font schemes), and presentation outlines to start from or at least use for comparison.  

Yet this packing of PowerPoint with tons of design aids has put overly powerful tools in the hands of too many speakers with too few clear ideas and too little graphic design sense. See http://www.slideshare.net/thecroaker/death-by-powerpoint for a reaction to this trend. And here is the classic 2003 Wired article from Edward Tufte: http://www.wired.com/wired/archive/11.09/ppt2.html

Presentation Zen seemingly arose as an antidote to letting slide-show form dominate over the speaker’s message.
At first glance, you wouldn't suspect this of the book, which is rife with full-color photos and illustrations. Not quite coffee-table level, but heading in that direction. How can such a book teach anyone simplicity?

If you just want the quick fix, head instead to the sample slide shows on office.microsoft.com after all. And I still find amusing Guy Kawasaki’s five-year-old blog post offering the 10/20/30 rule of PowerPoint: http://blog.guykawasaki.com/2005/12/the_102030_rule.html#axzz0tYdu3KcW

Garr Reynolds offers something more substantial. This short book aims to teach you how to think through your ideas in ways that have strong aesthetics yet let the message shine through. The visual and multimedia elements complement and call attention to what is important, enabling you to speak in a direct, unencumbered, and authentic voice. If you know that insane animations just drive everyone crazy while slide after slide of tiny bullets puts people to sleep, the book will help you find your own ground.
 

For those maybe more used to word processing or spreadsheets, he has three framework chapters about how to organize your thoughts specifically for a presentation. Many of us probably start by writing a short essay or lengthy email to presentation collaborators or to ourselves, then try to turn that into presentation slides. Planning and outlining can and should be different to fit the goal (a presentation to a group) and the tool (slide software). Using presentation software effectively can help turn just reading a speech into a lively, engaging, participatory presentation. This is Reynolds’ perspective.

He then walks the reader through practical design lessons and techniques. He has both organizing ideas at the grand scale and discussions of nitty-gritty details, right down to topics such as whether you need your logo on every slide. He gives practical ideas on how to present data in graphs, including when and how to keep audiences awake with complementary images.

Speaking of images, while top-drawer advertising firms may commission their own photos and such, Reynolds stresses the ability to include high quality yet inexpensive graphics from places such as iStockphoto. (I would also mention searching on Flickr for Creative Commons licensed images.) Personally, I have been trying to use more of my own photos in presentations. Not that I’m a great photographer. Finding or taking my own photos has become part of the sifting and reflection process, and of putting the message first.

A third section of the book takes on bringing your slides to life in the actual presentation. Slides are not meant to be emailed and read like a report. They are meant to be delivered, whether in person or via an on-line workshop. His advice here also blends overall technique with practical suggestions about length, logistics, and lighting.

Overall, the book has lots of good stuff, whether painfully learned reminders or things new to you, and I recommend it. Here's to not turning PowerPoint into a verb while keeping alive what is useful in this tool.

 

Some useful random resources I would mention (I'm sure you have many more):
Reynolds’ blog, http://www.presentationzen.com/
A presentation by Reynolds on presentations: http://www.youtube.com/watch?v=DZ2vtQCESpk
Sacha Chua’s 7 tips on remote presentations: http://sachachua.com/blog/2009/10/7-tips-for-remote-presentations-that-rock/
Slideshop.com: reasonably priced PowerPoint templates that fit a specific need. (Yes, some of these could get you back into overdo-it land, but that’s why you need Reynolds.)
And Confessions of a Public Speaker, Scott Berkun’s (www.scottberkun.com) great little book, which I reviewed on this blog earlier.
 
And please note: this is part 1 of a two-part series on tools and resources for effective presentations. Next, Debra Askanase will take on slide:ology: The Art and Science of Creating Great Presentations, by Nancy Duarte.

 
