Sunday, May 24, 2009

Concepts Assignment

Concept 28


“Advanced Internet users recognise the character of the Web, seek to utilise its advantages, ameliorate its deficiencies and understand that not all users have the same abilities as themselves in reconciling the paradox of the WWW.” (Allen n.d.)


Cloud computing seeks to take advantage of the paradox facing new users of computers and the WWW: it makes tasks and applications appear easy when, at a deeper level, they are complex and require skill and knowledge.

What is cloud computing anyway? The ‘cloud’ refers to data centres that sell excess storage and computing power to individuals, businesses and governments to amortise the cost of setting up and running said data centres. Many users already avail themselves of these services without realising there is a cloud component underneath, paying instead by viewing the advertisements placed around them. Some of the common ‘clouds’ include email (Gmail, Yahoo! and Hotmail), search (Google, Yahoo! and MSN), applications (Google Docs, MobileMe, Evernote), storage/distribution (Flickr, YouTube) and communities (Facebook, MySpace). In fact, according to a survey by the Pew Internet & American Life Project, 69% of online users are already taking advantage of the ‘clouds’ (Horrigan 2008).

In the tasks undertaken in this unit, the ‘cloud’ most accessed was Google, and this ‘cloud’ made the initial searching required very easy, belying the complex functions being undertaken on this user's behalf. As the unit progressed and advanced searching techniques were introduced, it became apparent that the use of the ‘cloud’, whilst beneficial initially, had encouraged lazy habits and a weaker grasp of the concepts underpinning the WWW.

However, an advanced user, aware of the ‘cloud’ and its implications, can use ‘cloud’ services where appropriate (or convenient) while remaining alert to their downsides and to the more complex but better tools and resources available. Advanced users also have a social responsibility to signpost the way and help educate novice users about the implications of always taking the “easy” way. Educating and assisting novice users to partake of the deeper web and to seek out a diversity of resources helps advance the Web through diversity and decentralisation.

So are the gathering ‘clouds’ good or bad in relation to the inherent nature of the Web: decentralised, innovative and complex? I would argue that in relation to innovation, diversity and decentralisation, ‘clouds’ are inherently bad because, first and foremost, they centralise data and applications, forcing users into proprietary systems. Innovation is stifled as users no longer have to deal with information in a variety of formats. Centralisation also subverts the freedom of the internet by raising the possibility of corporations or governments exerting increased control over the data stored and manipulated by these data centres.

However, these harms may be offset, or cancelled out altogether, by the economies the ‘clouds’ deliver to universities and businesses. If the bigger players lay down an ethical and fair playing field, providers may resist the temptation to ‘control’ the data they store, or to push users into ever more expensive proprietary systems.

‘Clouds’ also offer the major benefit of mobility, for both data and browser-based applications, making the Web easier to use and more accessible. The trade-off, more new users contributing to the Web versus its homogenisation and centralisation, illustrates another Web paradox.

Annotated Sites:

Site 1: University of California, Berkeley. Reliable Adaptive Distributed Systems Laboratory (RAD Lab). http://radlab.cs.berkeley.edu/

Cutting-edge use of cloud computing. This site is from a major U.S. university, with backing from internet heavyweights Google, Microsoft and Sun Microsystems. Their mission statement reads something like, “We want to enable an individual to invent and run the next big IT service.”

Although not a reference site, I felt it was important because of that mission statement. Because the ‘cloud’ will centralise data and services, I think it is universities especially that will provide ‘cloud’ service providers and users with examples of an ethical approach, one that promotes diversity and innovation. So even if these guys are promoting their own ‘cloud’ vision, they are also enabling individuals to provide new and unique services.


Site 2: First Monday. http://www.uic.edu/htbin/cgiwrap/bin/ojs/index.php/fm/index

A peer-reviewed internet journal with a clean, clear layout and navigation, and open access for both readers and contributors. This site is the future of journals if we are to follow a model of social value versus market value in disseminating academic information. It also has a great tool that allows references to be imported straight into EndNote. Being peer reviewed, the papers are of high quality, and First Monday makes its latest papers available free of charge. The website is provided by the University of Illinois and allows users to register for other journals within the university.

References:

Allen, M. (n.d.). "Internet Communications Concepts Document." Net 11 Internet Communications. Retrieved 20 May 2009, from http://lms.curtin.edu.au/webapps/portal/.

Horrigan, J. (2008). "Use of cloud computing applications and services." Retrieved 20 May, 2009, from http://pewinternet.org/Reports/2008/Use-of-Cloud-Computing-Applications-and-Services.aspx.

Additional sources:

Jaeger, P., J. Lin, et al. (2009). "Where is the cloud? Geography, economics, environment, and jurisdiction in cloud computing." First Monday. Retrieved 20 May 2009, from http://www.uic.edu/htbin/cgiwrap/bin/ojs/index.php/fm/article/viewArticle/2456/2171.




Concept 11

“Advanced Internet users learn to intuitively conceive of any document, file, message or communication as consisting of metadata and data. They then can explore the functions of various communications/information software looking for how that software can assist them in using metadata to enable sorting, processing or otherwise dealing with that data.” (Allen n.d.)

With the mind-bending amount of data on the web, a system is needed to describe the data held in any given container. This is the job of metadata, literally data about data. Like the familiar Dewey Decimal Classification used in libraries, the purpose of metadata is to help locate appropriate resources. However, metadata goes further than conventional systems by allowing far more information to be attached to a resource. The Dublin Core system defines fifteen metadata elements that describe, in three groups, a resource's content, intellectual property and instantiation.
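As a rough illustration, here is a minimal sketch in Python of a Dublin Core record for a web page. The fifteen element names and the three-group arrangement come from the Dublin Core standard; the values, and the idea of printing them out as HTML meta tags, are my own invented example.

    # A minimal Dublin Core record, modelled as a Python dictionary.
    # The fifteen element names are the standard's; values are invented.
    dc_record = {
        # Content: what the resource is about
        "title": "Concepts Assignment",
        "subject": "cloud computing; metadata; accessibility",
        "description": "Blog post discussing four Net11 concepts.",
        "type": "Text",
        "source": "",
        "relation": "Net 11 Internet Communications",
        "coverage": "2009",
        # Intellectual property: who made it and who may use it
        "creator": "A. Student",
        "publisher": "Blogger",
        "contributor": "",
        "rights": "All rights reserved",
        # Instantiation: this particular manifestation of the resource
        "date": "2009-05-24",
        "format": "text/html",
        "identifier": "http://example.org/2009/05/concepts.html",
        "language": "en",
    }

    # Emit the record as the kind of <meta> tags the dublincore.org
    # tools embed in HTML page headers.
    for element, value in dc_record.items():
        if value:
            print(f'<meta name="DC.{element}" content="{value}" />')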


At its most basic level, shareable metadata should be human-understandable (Shreeves, Riley et al. 2006). While completing the Net11 unit tasks and preparing the referencing for this concepts assignment, del.icio.us proved to be a great tool. Through its keyword tagging and notes metadata, bookmarks become easier to search and are given context. This not only speeds up referencing but provides ‘breadcrumbs’ that bring the mind back to where it was when the bookmark was created.


We are seeing an evolution from the current Web of documents towards a Web of linked data, with the broad benefits this brings (Hall, De Roure et al. 2009). To facilitate this new Semantic Web we need intelligent agents and bots to aggregate metadata, and this will require the big data collections to be more careful in how they use it. Rather than stuffing a single record in, say, the Dublin Core schema with all the information relating to a piece of data, several individually sensible records should be created, so that the aggregator bots building metadata collections can make the most of them.
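A toy sketch of the difference, with invented titles and URLs: the first record crams a whole collection into one description an aggregator can do little with, while the second yields one sensible record per item.

    # One "stuffed" record: everything about a photo collection crammed
    # into a single description field, opaque to a harvesting bot.
    stuffed = {
        "title": "Photograph collection",
        "description": "300 photographs of Perth, 1920-1960; donated by "
                       "the Smith family; digitised 2008; see items 1-300.",
    }

    # The same information as individually sensible records, one per
    # photograph, each usable outside its local context.
    granular = [
        {
            "title": f"Perth street scene, photograph {n}",
            "date": "1920/1960",
            "relation": "Smith family photograph collection",
            "identifier": f"http://example.org/photos/{n}",  # invented URL
        }
        for n in range(1, 301)
    ]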


Carl Lagoze has argued that “metadata is not monolithic ... it is helpful to think of metadata as multiple views that can be projected from a single information object” (Lagoze, 2001). In photography there is an increasing use of metadata to edit images, not just describe them. When you edit an image in its RAW state, you change the attributes of its associated sidecar XML file but leave the image data untouched. This means you have one base file and a second file that acts like a filter through which you perceive the image, allowing totally non-destructive editing and, perhaps more importantly, a big saving in library size. Will we have a web of base files with intelligent filters through which we view this base information, then edit and append to suit our purposes without ever touching the base file?
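A rough Python sketch of the idea. The crs: attribute names follow Adobe's Camera Raw conventions, but the file names and values here are invented; the point is that an “edit” rewrites a few hundred bytes of XML while the RAW file itself is never touched.

    from pathlib import Path

    # A toy XMP-style sidecar: the edit settings live here as XML,
    # next to a hypothetical RAW file photo.nef.
    sidecar = Path("photo.xmp")
    sidecar.write_text(
        '<x:xmpmeta xmlns:x="adobe:ns:meta/">\n'
        '  <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">\n'
        '    <rdf:Description\n'
        '        xmlns:crs="http://ns.adobe.com/camera-raw-settings/1.0/"\n'
        '        crs:Exposure="+0.50"\n'
        '        crs:WhiteBalance="Daylight"/>\n'
        '  </rdf:RDF>\n'
        '</x:xmpmeta>\n'
    )

    # "Editing" the image means rewriting the sidecar's attributes;
    # the multi-megabyte photo.nef is untouched throughout.
    text = sidecar.read_text().replace(
        'crs:Exposure="+0.50"', 'crs:Exposure="+1.00"')
    sidecar.write_text(text)
    print(sidecar.read_text())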


Site 1: http://dublincore.org/

This is the website of the Dublin Core initiative, a not-for-profit organisation helping to develop open standards for interoperable metadata. A great resource site for all things metadata: there are tools for adding metadata to web pages, a Firefox plugin for viewing metadata embedded in HTML and XHTML documents, and plenty of information about current schemas and upcoming proposals.


Site 2: http://www.oaister.org/about.html

This site is a result of the move towards a more semantic web. It uses the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH) to find data in the deep web, hidden behind scripts in institutional databases. As more organisations come on board this will become a very powerful resource; maybe OAIster is the new Google.
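To show how lightweight the protocol is, here is a minimal Python sketch of a harvester: an OAI-PMH request is just an HTTP GET with a verb parameter. The repository URL below is a placeholder, not OAIster's actual harvesting endpoint.

    import urllib.request
    import xml.etree.ElementTree as ET

    # Hypothetical OAI-PMH endpoint; substitute any repository's base URL.
    BASE_URL = "http://example.org/oai"

    # ListRecords asks the repository for its records, here expressed
    # in the unqualified Dublin Core (oai_dc) format.
    url = BASE_URL + "?verb=ListRecords&metadataPrefix=oai_dc"
    with urllib.request.urlopen(url) as response:
        tree = ET.parse(response)

    # The response is XML; Dublin Core titles sit inside each record.
    OAI = "{http://www.openarchives.org/OAI/2.0/}"
    DC = {"dc": "http://purl.org/dc/elements/1.1/"}
    for record in tree.iter(OAI + "record"):
        title = record.find(".//dc:title", DC)
        if title is not None:
            print(title.text)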

References

Dublin Core (2009). Retrieved May 20, 2009, from http://dublincore.org/

Hall, W., D. De Roure, et al. (2009). "The evolution of the Web and implications for eResearch." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 367(1890): 991-1001.

Lagoze, C. (2001). "Keeping Dublin Core Simple: Cross-Domain Discovery or Resource Description?" D-Lib Magazine, 7(1). Retrieved 18 May 2009, from http://dlib.anu.edu.au/dlib/january01/lagoze/01lagoze.html

OAIster (2009). Retrieved May 20, 2009, from http://www.oaister.org/about.html

Shreeves, S., J. Riley, et al. (2006). "Moving towards shareable metadata." First Monday. Retrieved 18 April 2009, from http://www.uic.edu/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/1386/1304




Concept 8

“The daily practice of electronic communication is shaped by over-familiarity with one's own computer system, and a tendency to assume that – as with much more established forms of communication – everyone is operating within compatible and similar systems. When in doubt, seek to communicate in ways that are readable and effective for all users, regardless of their particular systems.” (Allen, n.d.)

Web design should, in theory, be about making a site as accessible and useful to as many people as possible. Countless dollars and hours are spent on corporate websites meant to improve the public's awareness of a brand, yet frequently these sites are not accessible to people with disabilities. Designers and administrators have many tools and resources available to make their sites comply with accessibility guidelines, yet an animated splash page often seems to matter more than reaching their full potential market.


A diverse range of people of all ages and abilities access the web, and those abilities are shaped by physical and cognitive capacity, education, available technology and socio-economic factors. Designers and administrators should be conscious of these differing levels of ability and take them into account when planning and executing web deployments. “The power of the Web is in its universality. Access by everyone regardless of disability is an essential aspect” (Berners-Lee, 1997).

The BBC's Ouch! website is an exemplary piece of site design that looks good and uses multimedia, yet jumps through all the accessibility hoops. It has site-wide options for changing font size and colours (using cookies and CSS) so that changing them does not destroy the layout, there is a text-only version for Lynx browsers, and it can be navigated without a mouse. Simple stuff, but recent studies point out that a large percentage of sites (70-98%, depending on the category of site) are not accessible (Lazar et al., 2004).
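A bare-bones sketch of the cookie-plus-CSS pattern in Python (not the BBC's actual code; the cookie name and stylesheet file names are invented): the server reads a saved preference and links a different stylesheet, so the markup, and therefore the layout, never changes.

    from http.server import BaseHTTPRequestHandler, HTTPServer
    from http.cookies import SimpleCookie

    PAGE = """<html><head>
    <link rel="stylesheet" href="/{css}" />
    </head><body><h1>An accessible page</h1></body></html>"""

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Read the visitor's saved preference from the cookie header.
            cookie = SimpleCookie(self.headers.get("Cookie", ""))
            pref = cookie["textsize"].value if "textsize" in cookie else "normal"
            # Swap the whole stylesheet; the HTML itself stays identical.
            css = "large-text.css" if pref == "large" else "default.css"
            body = PAGE.format(css=css).encode()
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), Handler).serve_forever()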

Accessibility of an organisation's website, not only for the physically and mentally disabled but also for the elderly and the less educated, improves the brand. Made standard practice, it also improves the overall internet experience, because a site that is fully accessible and compliant with the W3C/WAI standards must employ strict XHTML and CSS. For this reason alone it would be good practice for all web designers to strive for full accessibility in the sites they design.


Web 2.0 leader Facebook is actively working with the American Foundation for the Blind (AFB) to improve its social networking service; according to the AFB's statistics, 20 million Americans have reported significant vision loss (Wauters, 2009). The other Web 2.0 (and possibly Semantic Web) leader, Google, is also contributing, with research on and implementation of accessible design for its basic search engine and other services.

Site 1:

http://www.w3.org/WAI/

This is the W3C's initiative to develop guidelines for Web accessibility and to provide support materials and resources that help people understand and implement it. There is extensive information about the existing standards and the working papers for tomorrow's standards, as well as links to all the important resources needed to implement accessible web design.

Site 2:

http://www.bbc.co.uk/ouch/

One of the best websites designed for accessibility, and one that looks and functions exactly as we would expect a “normal” site to. It proves the point that you do not have to sacrifice design to achieve accessibility. It uses multimedia, including podcasts and images, blogs and message boards, but has alternatives for users who need other options to use the site and access its information.

References:

Berners-Lee, T. (1997). Statement on the launch of the International Program Office for the Web Accessibility Initiative. Retrieved from http://www.w3.org/Press/IPO-announce

Lazar, J., Dudley-Sponaugle, A., & Greenidge, K.-D. (2004). Improving web accessibility: a study of webmaster perceptions. Computers in Human Behavior, 20, 269-288. Retrieved May 18, 2009, from www.elsevier.com/locate/comphumbeh

Wauters, R. (2009, April 7). Facebook Commits to Making Social Networking More Accessible for Visually Challenged Users. Retrieved May 20, 2009, from http://www.techcrunch.com/2009/04/07/facebook-commits-to-making-social-networking-more-accessible-for-visually-challenged-users/




Concept 24

“File transfer protocol remains the best example of how the Internet enables files to be sent to and from clients, at their initiation, thus emphasising the local autonomy of the individual user, and the arrangement of ‘cyberspace’ into publicly accessible and changeable regions.” (Allen, n.d.)

P2P is conceivably one of the main drivers of broadband take-up in Australia. In a society that once dreamt of egalitarian ideals, the idea of “sharing” is readily defensible, and Australians are amongst the most prolific downloaders of illegal content in the world. Total visits by Australians to BitTorrent websites including Mininova, The Pirate Bay, isoHunt, TorrentReactor and Torrentz grew from 785,000 in April last year to 1,049,000 in April this year, Nielsen says, a year-on-year increase of 33.6 per cent (Moses, 2009). Music, movies, games and warez (software) all reside on suburban computers. Is this rebellion or opportunism?

Fair-use models are replacing draconian models of copyright protection around the globe, not because of corporate philanthropy but from corporate necessity. In academia as well, the offering of full text under various open schemes to promote the interchange of information has irrevocably altered the landscape of intellectual property. Metadata harvesting and the Semantic Web will bring more pressure to bear on existing models of copyright and distribution, as institutions free up more data under open models to compete for market share with other institutions doing the same.

Social value versus market value is the consideration for us all. The growth of social networking and user-generated content implies that people are willing to create and share: they experience content and they create for others without expectation of payment. Rather than creating for fiscal benefit, these contributions spring from an apparently irrepressible desire in some people to “share” their experiences and thoughts.

FTP has historically been a one-way street of downloading. Web 2.0, plus an overall maturing of users, has made it a two-way street, with less emphasis on the commercial profits to be had from the Web. Are we moving away from a web of “taking/buying” towards a web of “sharing”, as we realise that if we co-operate more and profit less in the short term, we all profit in the long term?
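The two-way street is built into the protocol itself: the same client session can pull files down or push them up, at the user's own initiation. A minimal sketch using Python's ftplib, with an invented host, credentials and filenames:

    from ftplib import FTP

    # Hypothetical server and credentials, for illustration only.
    with FTP("ftp.example.org") as ftp:
        ftp.login("user", "password")

        # The classic one-way street: the client pulls a file down.
        with open("report.pdf", "wb") as f:
            ftp.retrbinary("RETR report.pdf", f.write)

        # The same protocol has always allowed the reverse: the client,
        # at its own initiative, pushes a file into a public region.
        with open("my-contribution.txt", "rb") as f:
            ftp.storbinary("STOR my-contribution.txt", f)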

Site 1
http://www.journals.uchicago.edu/doi/abs/10.1086/503518

A very well-thought-out legal analysis of the ongoing problem of music piracy and the damage the RIAA perceives it to cause.

Site 2

http://www.ethics.org.au/

Plenty of resources here for exploring the ethical world. Confronting personal knee-jerk reactions with a measured ethical approach will benefit us all.

References:

Moses, A. (2009). Illegal downloads soar as hard times bite. Retrieved May 22, 2009, from http://www.smh.com.au/news/home/technology/illegal-downloads-soar-as-hard-times-bite/2009/05/27/1243103577467.html