Mashup Patterns: Designs and Examples for the Modern Enterprise
Chapter 1, Understanding Mashup Patterns
In this chapter excerpt you'll learn about the history of the Web and the emergence of Web 2.0 and its components, particularly mashups. Find out about Enterprise 2.0 and the RSS and Atom formats, and discover the meaning of the term "mashup."
When the World Wide Web was first unveiled, "collaborators" referred to one small segment of the population: nerds. The first browser ran on a computer that almost no one outside of a university or research lab used. The Web itself consisted of a lone site (WWW Growth, Figure 1.1). Yet from this singularity, a new universe would soon emerge.
The amount of content didn't grow much until two years later. That was when the first of several "Big Bangs" would occur. In 1993, the first PC-based program capable of browsing the Web was released. Its introduction instantly put the Web within the reach of a far larger audience. Even so, Internet connectivity remained largely restricted to universities, research institutes, and corporations. Consumers enjoyed online communities, but generally did so via prepackaged, fenced-in services such as CompuServe, Prodigy, and America Online (AOL). Connectivity was achieved through slow "dial-up" connections over telephone lines. Access to content was typically billed at an hourly rate.
By 1994, the first independent Internet service providers (ISPs) had begun to pop up. By installing special software on their computers, consumers could access the entire content of the Web (almost 1,000 sites!). AOL began to open up Web access for its millions of subscribers. Prices universally moved to flat-rate monthly charges. WYSIWYG ("What you see is what you get") HTML editors appeared and made creating Web pages just a bit easier. In response, the second explosion in Web growth occurred. By 1996, corporations didn't see a Web presence as a luxury, but rather as a necessity. What better way to instantly push content to the consumer? The Web was viewed as a new media channel that offered endless opportunities for commercial success.
If the waning years of the past century had a motto, it certainly wasn't "Collaborators welcome"; "Venture capital welcome" is probably more accurate. Fueled by ill-conceived business plans and wild speculation, a worldwide expansion of the Web's underlying infrastructure took place. Meanwhile, the browser jumped from home computers to cell phones and mobile devices for the first time. High-speed cable and DSL "broadband" connectivity options became ubiquitous. The third explosion was the popping of the Web bubble, which saw these ventures implode en masse when they failed to turn a profit. This event marked the end of the first wave of the Web's evolution, which in hindsight we label Web 1.0.
In the aftermath of the Web 1.0 crash, the glut of infrastructure kept the costs of going online low. That simple fact helped attract even more users to come online. A few companies began to figure out how to leverage the Web without going bankrupt. Collectively, their embrace of the Internet represented the slow expansion of the Web from that last primordial blast. New marketplaces evolved as sites like eBay linked buyers and sellers from around the globe. These online flea markets, in turn, spawned communities that helped pioneer the concepts behind new social networking sites like MySpace and Facebook.
This chapter is an excerpt from the new book, "Mashup Patterns: Designs and Examples for the Modern Enterprise" authored by Michael Ogrinz, published by Addison-Wesley Professional, ISBN 032157947x. For more information, please visit informit.com or mashuppatterns.com. Safari Books Online subscribers can access the book here.
By 2006, the firms that had simultaneously feared and tried to control Web 1.0 looked up from licking their wounds and saw the dawn of a new paradigm. In a symbolic changing of the guard, "old media" giant Time magazine announced the Person of the Year was "You." There was no great single occurrence that made this milestone possible. Rather, the driving force was the confluence of many events: the spread of cheap broadband access, the Web enabling of multiple devices, the arrival of new communication environments, and the emergence of cooperative environments for organizing information. Collaborators were finally running the show.
Industry figurehead Tim O'Reilly is credited with popularizing the term "Web 2.0" to define this new age:
Web 2.0 is the business revolution in the computer industry caused by the move to the Internet as platform, and an attempt to understand the rules for success on that new platform.
A simpler working definition is that Web 2.0 is a shift from transaction-based Web pages to interaction-based ones. This is how the power of "You" is mashed, mixed, and multiplied to create value. Social-networking sites, folksonomies (collaborative tagging, social bookmarking), wikis, blogs, and mashups are just some of the components that make this possible. The success of sites such as Facebook, Wikipedia, Flickr, and Digg has demonstrated that the democratization of content creation and manipulation is powering the latest wave of Internet growth.
The underlying driver of Web 2.0 is flexibility. The one trait technologies slapped with the Web 2.0 moniker share is that they are extremely (and perhaps sometimes unintentionally) malleable. The successful products don't break when a user tries to extend them beyond their original design; they bend to accept new uses. Two success stories of the new Web illustrate this principle:
Flickr was started by Caterina Fake and Stewart Butterfield as an add-on feature for a video game they were developing. The idea was to allow players to save and share photos during gameplay. When they realized that bloggers needed a convenient way to store and share photos, Fake and Butterfield started adding blog-friendly features.
Opening up their architecture to allow users of the site to create custom enhancements fueled Flickr's viral spread. The original game was ultimately shelved, and Flickr was sold to Yahoo! a year later for an undisclosed sum.
del.icio.us grew from a simple text file that its founder, Joshua Schachter, used to keep track of his personal collection of tens of thousands of Web site links. When the site went public in 2003, it spawned a host of add-ons. The concept of associating data with simple keywords to aid in organization wasn't new, but the cooperative "social tagging" aspect of del.icio.us resonated with the frustrations of other Internet users.
Inevitably, when people discover a useful tool outside the workplace, they want to use it at the office as well. This happened years earlier when employees began sneaking personal computers into their offices to make it easier to manage spreadsheets and documents. More recently, end users have imported instant messaging and unlimited email services from external sources.
User demand for Web 2.0 technologies within existing corporate infrastructure is the catalyst for Enterprise 2.0. The challenge for firms is to integrate these new peer-based collaboration models with legacy technologies and mindsets. Figure 1.2 illustrates three areas of control that established organizations typically maintain over how solutions are delivered.
Enterprise 2.0 breaks down traditional divisional barriers and encourages building bridges. The managerial structure does not change, but the ability to conceive solutions and access the technology to deliver them is available to everyone (as shown in Figure 1.3).
Changing the social structure of a firm is termed "soft reorganization." Its consequence is movement away from fixed roles and responsibilities and toward a more open and unrestricted workplace. The phrase "economies of scale" refers to the cost advantages associated with large-scale production. We term the benefits of Enterprise 2.0 the "economies of collaboration." How are they established?
- Nontechnical users are empowered to create application solutions without engaging management or IT personnel in the process. This agility leads to shorter time-to-market cycles.
- Folksonomies replace strict taxonomies (see the "Folksonomies versus Taxonomies" sidebar). Newly discovered connections between data and processes can be exploited to add business value.
- New communication tools mine "the wisdom of the crowd" to encourage collaboration and innovation, a technique known as crowdsourcing (see the "Crowdsourcing" sidebar).
- Open interaction can help teams discover how the other lines of business operate. This knowledge, in turn, leads to changes that strengthen relationships across departments.
- IT must learn more about the business associates' goals, and create an environment that facilitates the rapid construction of products that they require.
- Members of the business team must participate more directly in the engineering process (either on their own or in partnership with IT), which requires some knowledge about development best practices.
- Management needs to cede some control to other teams and should work with all associates to encourage collaboration. This may entail:
  - Funding the necessary infrastructure.
  - Allowing cross-pollination between business teams.
  - Being open to ideas from nontraditional sources.
Security becomes a universal concern as the lines between teams vanish. The former "checks and balances" approach doesn't work when small teams are creating end-to-end solutions. In this collaborative milieu, firms have to strike a balance between technical controls and education to mitigate risk.
Folksonomies versus Taxonomies
Taxonomies describe the organization of data within a strict hierarchy. In the business world, they are typically artifacts of established corporate structures. The managerial chain of command establishes processes for the composition, categorization, and flow of information. The structure of a rigid taxonomy may be nonintuitive to outsiders and consequently may restrict the sharing of useful information across the firm.
In a folksonomy, the community takes responsibility for collectively classifying and organizing information through a process known as "tagging." Tagging simply entails labeling content with a few relevant keywords that describe the information or the ways in which it can be used. As more reviewers add and refine tags, it becomes easier to locate and navigate large amounts of information. The process of tagging creates a dynamic knowledge base of material that is not constrained by conventional organizational techniques.
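The tagging mechanism the sidebar describes can be sketched in a few lines of Python. This is a minimal illustration, not any particular product's design: the `Folksonomy` class, its method names, and the sample documents are all invented for the example. The key idea is that the index is built from whatever free-form tags users apply, so content can later be found along dimensions no central taxonomy anticipated.

```python
from collections import defaultdict

class Folksonomy:
    """A tiny tag index: each tag maps to the set of items carrying it."""

    def __init__(self):
        self._index = defaultdict(set)

    def tag(self, item, *tags):
        # Any user may attach any keywords to any item; no fixed hierarchy.
        for t in tags:
            self._index[t.lower()].add(item)

    def find(self, *tags):
        """Return the items carrying every one of the given tags."""
        sets = [self._index[t.lower()] for t in tags]
        return set.intersection(*sets) if sets else set()

folk = Folksonomy()
folk.tag("Q3-sales.xls", "sales", "quarterly", "emea")
folk.tag("pipeline.doc", "sales", "forecast")

print(folk.find("sales"))           # both documents
print(folk.find("sales", "emea"))   # only the spreadsheet
```

As more users add and refine tags, the intersection queries become more discriminating, which is the "dynamic knowledge base" effect described above.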
Crowdsourcing
With crowdsourcing, a problem is framed so that it can be tackled by multiple teams or individuals, working either competitively or cooperatively. User-driven mashups can facilitate this type of mass collaboration in the enterprise, drawing far more resources into building solutions than traditional IT alone.
A danger of this approach is that a "herd mentality" might develop that stifles creativity. Some degree of oversight can offset this risk, but care must be taken not to discourage participation.
Crowdsourcing success stories include the Ansari X-Prize, which was designed to encourage low-cost space travel, and Wikipedia, which benefits from the combined contributions of thousands of users.
The birth of mashups
Quick, easy, and affordable application development has always been a goal of software engineering. Reusing something that's already been built, tested, and paid for is one of the quickest ways to achieve this objective. From subroutines, to external libraries, to object orientation, to templates, to Web Services, each great advance in programming was born from the desire to reuse material instead of starting from scratch. The limitation inherent in each of these milestones is that it was created by developers solely for use by others in their profession.
It seemed inevitable that, with the vast amount of new material being placed on the Web every second, that content could somehow evolve into raw material for software development. Tim Berners-Lee envisioned this leap in Web reusability in what he termed "the semantic Web," which describes a platform for the universal exchange of data, knowledge, and meaning. And while work continues to define new languages and protocols to realize Sir Tim's dream, mashups are making this vision a reality now.
Mashups are an empowering technology. In the past, resources had to be designed for reuse. Application programming interfaces (APIs) had to be created, packages compiled, documentation written. The application developers and solution architects who recycled resources were subject to the whims of the original designers. With mashups, you aren't limited to reusing an existing API; you can impose your own if none exists. So if an application or site offers no API, or if you don't like the access methods that are already in place, you can design and implement your own (see the API Enabler pattern in Chapter 4 for several examples). The promise of achieving programmatic access to almost unlimited data is intoxicating. Even more exciting is the notion that the tools for constructing mashups have begun to reach a level of usability where even nontechnical users can build their own solutions.
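The "impose your own API" idea can be made concrete with a short sketch. Everything here is hypothetical: the embedded HTML page, its table layout, and the `get_rate` function are invented for illustration, and a real site would need its own parsing rules (and permission to scrape). The point is only that structured data visible in a page can be wrapped behind a function call the original site never offered, using nothing beyond the standard library's `html.parser`.

```python
from html.parser import HTMLParser

# Invented sample page standing in for a site with no published API.
PAGE = """
<html><body>
  <table id="rates">
    <tr><td>USD</td><td>1.00</td></tr>
    <tr><td>EUR</td><td>0.92</td></tr>
  </table>
</body></html>
"""

class RateScraper(HTMLParser):
    """Collects (currency, rate) pairs from two-cell table rows."""

    def __init__(self):
        super().__init__()
        self._in_cell = False
        self._row = []
        self.rates = {}

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "td":
            self._in_cell = False
        elif tag == "tr":
            if len(self._row) == 2:
                self.rates[self._row[0]] = float(self._row[1])
            self._row = []

    def handle_data(self, data):
        if self._in_cell and data.strip():
            self._row.append(data.strip())

def get_rate(currency):
    """The improvised 'API' imposed on a site that never published one."""
    scraper = RateScraper()
    scraper.feed(PAGE)
    return scraper.rates[currency]

print(get_rate("EUR"))
```

In a real mashup the `PAGE` constant would be replaced by a fetched document, and the fragility of this approach (the wrapper breaks whenever the page layout changes) is exactly why published APIs and feeds remain preferable when they exist.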
Many popular definitions of a mashup would have you believe the term is limited to a combination of Web-based artifacts: published APIs, RSS/Atom feeds (see the "RSS and Atom" sidebar), and HTML "screen scraping." Although there are certainly valuable solutions in that space, a broader world of data can be mashed up, including databases, binary formats (such as Excel and PDF), XML, delimited text files, and more. The rush of vendors attempting to capitalize on the burgeoning market for enterprise solutions hasn't helped bring clarity to the field. To turn a classic phrase on its head, we have a ton of nails out there, and everyone is trying to tell us that they have the best hammer.
RSS and Atom
RSS (variously expanded as Really Simple Syndication or Rich Site Summary) and Atom are formats for publishing Web-based content in a manner consumable by special applications termed "feed readers." Feed readers aggregate multiple feeds (or "subscriptions") so that a user can view updates to numerous Web pages from a single environment.
Before RSS and Atom existed, users had to visit each site manually and check for new updates. Feeds also serve as a popular method for allowing Web sites to dynamically incorporate content from external information providers. Regardless of their originally intended purpose, because feeds are created using a well-structured format (XML), mashups can easily consume them as a data source.
Another common misconception is that mashups combine at least two disparate sites to form a brand-new "composite" application, complete with a neat new user interface. That's certainly possible, but mashups need not be an end in themselves. It is more accurate to say that all composite applications are mashups, but not all mashups are composite applications. The enterprise mashup creator can use the technology to transform the Web into his or her own private information source. This data can be used for strategic planning or analysis in systems like Excel or MATLAB. Mashups may also be used to access a single resource at superhuman levels to mine data or migrate content. Creating mashups is all about finding data, functionality, and services and using them to both solve problems and create opportunities.