LGBTQ Online Identities


Communities through time have given people a sense of purpose and identity (within pre-modernity) and have allowed people to experiment with, adapt and explore their identity (modernity and post-modernity). Cyberspace allows communities to transcend the geographical locations which for a long time were the driving forces of community definition. This form of community building has been heavily used by many sectors of modern society, from families living across the world to political movements and hacking communities such as ‘Anonymous’. The LGBTQ (Lesbian, Gay, Bisexual, Trans, Queer/Questioning) community has embraced this form of community building and these communication technologies for a wide number of reasons. Alternative sexualities have over time been victimized and persecuted, and sexuality is also an invisible facet of human identity; by moving it online and associating it with a profile, it can be made visible while being removed from the judgmental views of society.

This report looks into the theory that grounds the creation of communities online and how they influence the formation of individuals’ identities on- and offline. It looks into specific cases set within the LGBTQ field, drawing on the community and self-narrative predominantly within the youth demographic.

Grounding Theory

The three academic fields of “Queer Theory”, “Cyber Studies” and their intersection, “Cyber Queer Studies”, form the grounding for the understanding of queer online communities.

Queer Theory

“Queer” has over the years been reclaimed by LGBTQ academia, moving from a term which was seen as homophobic and derogatory to an umbrella term for sexualities: a “coalition of culturally marginal sexual self-identification” and a “nascent theoretical model which has developed out of more traditional lesbian and gay studies” (Jagose, 1997a). It builds on the feminist idea that gender is essential to the self and on the socially constructed nature of sexuality that came out of lesbian and gay studies (Jagose, 1997b).

Cyber(culture) Studies

Cyber studies was kick-started in the 1990s, treating the social and cultural dimensions of the Internet as a distinct and relevant topic. Early work within the field identified two main research areas: the management of identities online (Bruckman, 1992), and the use of the internet for community building (Turkle, 1997b).

Cyber Queer

Cyber Queer is the joining of Cyber Studies and Queer Theory.

“The “cybersubject” appeared to be the ultimate manifestation of queer theory, as it was seen to transcend the physical world in a parallel space, where it freely and flexibly could pick and choose who to be.”(Tudor, 2012)

“All the world, in feminist and queer theory, it would seem, is no longer a stage, but a screen.” (Case, 1995)

This field looks specifically at the intersection of LGBTQ issues and the online community. As the quotes suggest, it allows for the separation of body and mind, allowing for an interaction with sexuality and identity never seen before.

Cyber Queer Studies can be broken down into four areas of research (Wakeford, 2002):

  • Identity and presentation online
  • Queer virtual spaces
  • Electronic facilitation of social networks online and virtual communities
  • Potential of new technology to transform erotic practice

These points will be touched on throughout this report.

Sense of a Community

The factors affecting the formation of communities have been broken down into two components: the first being geographical and the second being interest-based. Geographical can be a town, city or street, whereas interest can be a political, social or creative interest. These are not mutually exclusive though, and as society develops there will be a move towards community formation around interests rather than geography, brought on by the progression of technology (McMillan and Chavis, 1986). From this a community can be defined by four factors:

  • Membership – Group membership is defined by the boundaries of the group, which are established by the deviants of an existing group. Barriers to entry such as language, dress or ritual can also define membership. For individuals, membership brings benefits such as emotional safety and allows for personal investment through belonging (McMillan and Chavis, 1986).
  • Influence – Validates membership of a group, giving meaning to the belonging, as the individual can see they are making a difference. This is a bidirectional concept, as the group has influence over the members, and that members have influence over the group (McMillan and Chavis, 1986).
  • Integration and fulfillment of needs – The feeling by the individual that the resources received from the group will fulfill their needs, thus reinforcing the positive sense of togetherness which is needed to maintain the group. Individuals are attracted to groups whose skills they feel will most benefit them; this has been called person-environment fit (McMillan and Chavis, 1986).
  • Shared Emotional Connection – This is the shared narrative in the group; common stories, relationships, and history. They do not have to have participated in it, it is the ability to share and communicate it (McMillan and Chavis, 1986).

This can be summed up by:

“Sense of community is a feeling that members have of belonging, a feeling that members matter to one another and to the group, and a shared faith that members’ needs will be met through their commitment to be together” (McMillan, 1976)

SIT (Social Identity Theory)

Sense of community defines the reasons for communities at a high level; SIT (Social Identity Theory), however, attempts to explain the way groups form, and the formation of behavior, trust, solidarity, in-group (intra-group) communication, and communication with and discrimination against external groups (inter-group). The states which build the groups can be broken down into four distinct stages (Trepte, 2006).

Social Categorization

This comes from the need to process large quantities of information: to reduce that amount, individuals group people by common characteristics. Categorizing individuals into groups makes understanding the social environment more manageable (Tajfel, 1979).

This then leads to inter- and intra-group differences being used for the formation of these identities, with inter being the differences between groups, and intra being the differences between members. When all members of a group conform to the same social categorization it allows the group to function according to stereotypes, allowing them to give reason for behavior, and to explain and give sense to the group. However, this also allows for social stereotyping from external groups through inter-group communication, categorization and discrimination (Tajfel, 1982).

It is not just external people who categorize; individuals categorize themselves into groups, allowing them to gain a self-identity from the group identity. Accessibility to a group’s membership is determined by the current social and/or emotional significance it has to the individual; this is formed by assessing the similarities and differences between the members of the group and the self (Oakes et al., 1991). This bears similarities to group membership as mentioned before.

Social Comparison

This is where social categorization is taken by an individual or a group and used to compare themselves with external individuals or groups. To gain an understanding of the importance of the groups to which individuals belong, comparison with other groups is used, especially where there is little justification for membership of a group. This ultimately determines the building of social identity and self-esteem. Social comparison is in essence the result of comparing internal and external group membership; however, the outer group must bear similarities to allow for a comparison (Tajfel, 1982).

Social Identity

This has been defined as “that part of an individual’s self-concept which derives from his knowledge of his membership of a social group (or groups) together with the value and emotional significance attached to the membership” (Tajifel, 1979). This comes from the self-assessment of the positive in-group dimensions and relevant out-group comparisons, thus allowing for the formation of a positive social identity. However, as social groups and comparisons are fluid, social identity is in constant flux and change, making it negotiable (Trepte, 2006).

Self Esteem

Self-esteem has many varying definitions depending on the context it is used within. In SIT, the way it is achieved is described a number of times; a prevailing theme is the ability to maintain a positive social identity. Self-esteem is seen as the “motivation underlying inter-group behavior” (Trepte, 2006), meaning individuals try to conform their identity to that of the group to gain a positive social identity. However, in core SIT theory self-esteem is often not mentioned explicitly, as it is considered a foundation for the whole formation of the theory (Trepte, 2006).

Identity and Modernity

This is the final grounding theory for identity. The previous two described how and why individuals form groups; this one attempts to ground why people need identities.


Within pre-modernity, identity was given and fixed; there was no need to change it. Decisions and changes were made/governed by the institutions of the time, e.g. Church, Government or Monarchy. This does not mean all decisions were removed from people’s lives, but the tradition and culture established an order of life which was followed (Baumeister, 1986). In the order of priorities, identity was one of the last on the list of priorities of existence, with survival and reproduction at the top (Hermannsdóttir, n.d.).


The movement to modernity can be understood through the examination of six characteristics which are associated with modernity (Giddens, 1991):

  • Industrialism
  • Capitalism
  • Institutions of Surveillance
  • The era of total war
  • The rise of the organization
  • Dynamism

Through these six forces society has seen a fundamental and dramatic change affecting all components of society and its inhabitants, not just affecting how each communicates but how each justifies its existence within society. Dynamism is one of the key points for self-identity formation as noted in SIT; it contains (Hermannsdóttir, n.d.):

  • Separation of time and space – in the past, time would have been locally defined, changing relative to space. With the global adoption of a unified time system, time has been separated from space. This allows for social relations and communication through time-space around global systems (Giddens, 1991), and builds on the understanding that communities will break the geographical constraints which they previously formed around.
  • Reflexivity – this allows for the change of social activity, identity or knowledge in response to newly gained or formed knowledge (Beck, 1992).

These all formed modernity, which in turn formed individualism. However, for modernity to continue an individual must separate from the structures controlling them, allowing them to adapt/create a modernity which allows for opportunities, risks, and contradictions (Beck, 1992). Thus people face a range of questions and choices with no guidance aiding the decision making, which in turn makes people look inwards, making themselves the center of their universe (Beck and Beck-Gernsheim, 1995). All this breaks down the constraints people had of class, gender and family, turning life into a status-based lifestyle and giving people an ego-centric world view in which they believe they can control the world around them, making them self-reflective (Beck, 1992). This means people now have to make decisions which in the past were made for them, and there are many such decisions to be made in defining an individual path. Due to this there is an increased presence of the questions “Who am I?” and “How shall I live?” (Giddens, 1991; Hermannsdóttir, n.d.).

However, all this leads to increased levels of risk through identities being reflexive. This has led to the definition of the ‘risk society’ (Beck, 1992):

“[l]iving in the ‘risk society’ means living with a calculative attitude to the open possibilities of action, positive and negative, with which, as individuals and globally, we are confronted in a continuous way in our contemporary social existence” (Giddens, 1991).

The concept of social control takes a greater hold; this is the lengths to which people try to control and correct their social and natural worlds. This can come in the form of plastic surgery, or couples getting a divorce to correct issues in their union. Death is the point at which a person ultimately loses control (Mellor and Shilling, 1993). Social control of self-identity can be broken down into three points:

  1. Surveillance – This is a factor of modernity: there is an increased level of surveillance, and individuals who do not conform to the norm become outcasts/aliens; with identity being reflexive, however, the boundaries can be smoothed (Giddens, 1991; Beck and Beck-Gernsheim, 1995).
  2. Public and Private Sphere – Life can be separated into two spheres. The public sphere is under administrative power; this has the ability to influence both civil and state processes, allowing them to develop in union. The private sphere resists the surveillance of the state and is thus merely a legislated outpost of the public sphere. An identity spans both spheres, as individuals no longer live only in the private sphere. For this reason we allow outside influences to make our identity (Giddens, 1991). “[w]hat looks like the outside world becomes the inside of an individual biography” (Beck and Beck-Gernsheim, 1995).
  3. Shame – This is the anxiety that a person’s narrative is not fulfilled, and that it can come under scrutiny by the public. In essence it is a public anxiety state (Giddens, 1991; Warner, 2000b).

From this, self-identity is a reflexive project which is in constant change. Thus, in reaction to the reflexiveness of modernity, self-identity is the ability to maintain and explore an ever-changing narrative which is constantly being challenged and scrutinised by modern societies (Giddens, 1991).

Looking for Identity and Community Online

Communities online have similarities to offline communities, though they can be defined as the ultimate removal of place from community, focusing it purely around interests and giving members a sense of convenience (Laukkanen, 2007). Some have commented that this makes people more isolated from their local geographical communities, which are said to be doomed to die. However, it is an enabler for people who may not have the ability or skills to have face-to-face interactions with a local community (Haythornthwaite, 2007; Turkle, 1997a).

The internet can be seen as both a facilitator of and a by-product of modernity taking hold: allowing a greater separation of time and space, allowing organizations to have an ever-increasing reach, and facilitating greater access to media and information about distant locations through such means as 24hr news channels (Hermannsdóttir, n.d.). It has also helped the move for people to form communities around interest rather than geographic region, as mentioned before.

As mentioned in the background theory, social categorization allows people to become part of groups; however, it is also the foundation for discrimination and persecution. This can be seen in the LGBTQ community’s levels of acceptance within geographical communities, which depend on the social norms that are accepted. If their gender and/or sex representation adheres to the social norms, they can be more accepted in comparison to the stereotypes of “fem boys” and “butch dykes” (Warner, 2000a). Stigma around sex and sexuality is perpetuated around the world, from political bodies to media outlets, and has been seen to facilitate heteronormative ideas of dating, marriage, and reproduction. Thus, to establish an understanding of self, people try to form a new norm to justify themselves (Warner, 2000b).

Through the development of the Internet there have been multitudes of online communities, varying in size, shape and technology. Early communities on the Internet were formed around newsgroups: a form of mailing list which users used to communicate about common topics of interest. As the population of the Internet grew and more people were using it in their daily lives in the 1990s, message boards became popular. This then led to an increase in chat rooms, where real-time communication could take place. Ultimately, online communities now form around social network sites similar to Facebook or Twitter. All have grown on top of each other, becoming more and more sophisticated, allowing and facilitating communication at a greater rate (Woodland, 1999; 1995; O’Riordan, 2007). Currently the youth use networking sites like these like no other demographic, as they have grown up with technology which may not have existed 10 years ago (Alexander, 2004).

Youth have embraced these means of communication and community, as discussing and identifying with a sexuality online is perceived as less risky and less stigmatized, and can be hidden to a greater extent (Brown et al., 2005); it removes the gaze from the sexuality which in the past meant always having to discuss it and define it within the bounds of the heteronormative community (Tudor, 2012). There is also the ability for them to locate one another (Gross, 2007), finding refuge in online worlds when they could be seen as the “town freak” in their real world (Tudor, 2012), and verifying that they are not the only one. This has been seen through studies of rural communities, where the youth use online forums and chat rooms to find other people like themselves, with similar stories and the same interests (Greteman, 2012).

This high usage of online networks could be seen through the high number of gay/MSM (men who have sex with men) orientated online chat rooms in the 1990s (Wakeford, 2002). Youths give many reasons for their usage of these communication tools (Paradis, n.d.):

  • Sexual Identity
  • Same sex friendship
  • Discussing coming out
  • Same sex intimacy
  • “Homosex”
  • Discovering and practicing living within the gay community

This is to overcome the feeling of distance and isolation, allowing them to socialize and discover their non-mainstream identity (Paradis, n.d.). This can be seen as users trying to find a meaning for themselves, find a community and gain information (Egan, 2000).

The forms of relationships that are formed within these online communities have been called meaningless and vacuous (Laukkanen, 2007). Academic literature refers to these online relationships as “hyperpersonal interaction” (Walther, 1996): an intimate and intense relationship with another member that removes the body, highlights the similarities and reduces the differences between individuals, again making people feel less unique and that other people have had the same experiences as them (Paradis, n.d.).

Chat rooms and message boards were seen to have a number of discourse topics common amongst them (Laukkanen, 2007), including:

  • Discourse of Love
  • Discourse of Sex
  • Discourse of (fluid) Identity

Within the Finnish forum Demi, there was a separate part of the forum called #closet where there was little preconception about individuals, allowing their feelings and discussions to become normalized. However, creating these two zones for conversation created two identities for people to live in (Laukkanen, 2007). #closet allowed people to question their sexual and gender identity, changing it from day to day and allowing people to safely and openly discuss their feelings.

This exploration, looking for new information and identity, has meant that youths have a greater understanding of the construction of identity, allowing them to be more fluid in their use of sexuality labels and gender identities (Paradis, n.d.).

Forms of gender selection and gender play have been seen across a number of online spaces, especially in fantasy online role-playing games, where up to 60% of participants have taken part in this form of gender play (Hussain and Griffiths, 2008). Players gave many different reasons for participating, from “I find being a girl is easier in male dominated games” to “It enables me to play around with aspects of my character that are not normally easy to experiment with”; this allows people to experience what it feels like to be the opposite gender (Turkle, 1997c). However, this is only seen in fantasy role-playing games, and the closer to real life it becomes the more uneasy people get with it (Horsley, 2004). The community sometimes finds this form of gender play deceptive, feeling that they are having a relationship built on deception. “The Strange Case of the Electronic Lover” (Van Gelder, 1991) is the story of a two-year relationship between a New York online community and a female persona which turned out to be a male psychologist. The male tried to kill off the female persona; however, the community loved her so much they tried sending flowers to the hospital. Once they found out the truth they felt foolish and betrayed.

However, some websites, even though they appear to allow people a fluid identity, can direct people into an idealistic image. This can be seen by looking at websites for gay men/MSM which idealize men into a hyper-masculine image where effeminacy is looked down on (Light, 2007). This could be down to gay men looking for “hyper-masculine prestige sex” in comparison to heteronormative identities/ideals (Barrios and Lundquist, 2012). This form of awareness of masculinity can cause anxiety; however, it can also increase self-awareness and personal freedom (Horsley, 2004), though it places a high importance on social categorizations (MaGlotten, 2007). Even though these forms of categorization appear to be solid and well formed, they do allow for fluidity in defining who you are, allowing you to change representation at will; this can be seen heavily through “closeted” individuals (Mowlabocus, 2008). This form of hypersexual/masculine interaction is contradicted by young members (Paradis, n.d.), who indicate that sex is not the primary objective of their online communities, which goes against stereotypes. This can also be seen in the high proportion of young LGBTQ members who engage in face-to-face meetings and online communication, in comparison to non-LGBTQ members, not for sexual reasons but to try to find their identity and self (Paradis, n.d.).

This form of fluidity brings up the question of how identity online is formed. There is a change in the perception of a person, as an identity is formed by the mind and not influenced immediately by a body, allowing people to have a relationship with the mind, removing the body and time from the equation. It could be said that the mind, body and time merge into one entity (Turkle, 1997d).

This form of flux allows people to become their “true self”; it allows people to be who they feel they should be (Turkle, 1997a). However, this freedom within the online space can allow a form of split personality to develop, with multiple names, characteristics and traits for online activity depending on the narrative being played out online (Turkle, 1997a). These personalities and identities are taken as a given by the people who interact with them, forming strong hyperpersonal relationships between them, as mentioned before. This then causes issues when the identities move from online to offline: when face-to-face contact is made it ends up highlighting the differences between each other (Egan, 2000). This also builds on the understanding of the parliament of selves, where there is conflict between the identities created by individuals (Mead, 1934).

As identity is reflexive and ever-changing, online spaces allow for the constant self-creation and adaptation of narrative in a short space of time, with the formation of a person’s identity/narrative being defined by audience, information, and identity (Woodland, 1999). This treats identity more as self-reflective storytelling, which is more critical of the self in comparison to Social Identity Theory, which talks about adapting the self for a community; here it is more a matter of finding a group that has the correct intra-group relationships to meet the needs of the individual.

Bridging the Online/Offline Divide

Recently there has been a growth in social mobile applications which use a user’s location to construct the community which they have access to. Elements of this can be broken down into three points (Toch and Levi, 2012):

  • Physical Location – Using the user’s current location to construct the community.
  • Identity Management – There is still a trend to not be recognized on these services.
  • Trust – This is established through text communication with other members of the app. A form of ‘protocol’ is established.
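The first element, physical location, can be illustrated with a toy model (a sketch only: the `Profile` fields, pseudonymous handles, 5 km radius and haversine distance filter are illustrative assumptions, not a description of how any real application is built):

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Profile:
    handle: str  # pseudonymous handle: identity management means no real name is needed
    lat: float
    lon: float

def distance_km(a: Profile, b: Profile) -> float:
    """Great-circle distance between two profiles (haversine formula, Earth radius 6371 km)."""
    dlat = radians(b.lat - a.lat)
    dlon = radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def nearby(me: Profile, others: list[Profile], radius_km: float = 5.0) -> list[Profile]:
    """The 'community' is simply everyone within the radius, sorted nearest-first."""
    in_range = [p for p in others if distance_km(me, p) <= radius_km]
    return sorted(in_range, key=lambda p: distance_km(me, p))
```

The point of the sketch is that, unlike interest-only communities, membership here is recomputed from physical proximity every time the user moves.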

One notable example is Grindr[1], which is aimed at the gay/MSM community, allowing users to communicate with others in their local vicinity. This brings the theory of community formation back to one based around location and interest, rather than the recent trend of interests alone.

However, these formats of communication have been linked to an increasing amount of high-risk sexual behavior, with increased rates of STI transmission between users in cities such as New York and Los Angeles (Beymer, 2012; Landovitz et al., 2012). At the same time, these tools have been shown to be effective recruiting tools for STI prevention and testing measures in the same cities (Landovitz et al., 2012; Burrell et al., 2012).

This epitomizes the increased rate at which LGBTQ people meet offline compared to non-LGBTQ people, with more indicating they would be willing to meet / have met people in person, citing the same reasons for meeting offline as for going online looking for communities (Paradis, n.d.). This goes against the stereotypes which are commonly held, especially for gay men, and which have been highly studied in academic literature (Barrios and Lundquist, 2012; Landovitz et al., 2012). However, these physical meetings do not come without physical and social risks, especially with ‘people nearby’ apps. These are ‘high risk’, ‘high gain’ encounters which, if they go wrong, can trigger social embarrassment, emotional harm, and physical risk (Toch and Levi, 2012). Nevertheless, they do allow people to identify with a physical group which they could have been looking for all along through online communities (Paradis, n.d.).


As people use technology more in everyday life, the boundaries between life and techno-life blur, becoming the same (Karl, 2007). The Internet once allowed people to explore and find communities, allowing their identities to be self-reflexive and the fringes of society to find a voice, community and identity, making them feel less alone through hyperpersonal relationships. Does the hope of bridging the gap and turning hyperpersonal relationships into physical relationships instead force people further online, if the physical relationships prove less fulfilling than the hyperpersonal ones? This conflict of self has been demonstrated in the parliament of selves, where there will be conflict between the personalities and identities that people construct for themselves (Mead, 1934).

Research Question

Do ‘people nearby’ applications allow youths to interact offline, or do they drive people further online through the conflict of constructed identities?



Alexander, J. (2004) In Their Own Words

Barrios, R.J. and Lundquist, J.H. (2012) Boys Just Want to Have Fun? Masculinity, Sexual Behaviors, and Romantic Intentions of Gay and Straight Males in College. Journal of LGBT Youth

Baumeister, R.F. (1986) Identity: Cultural change and the struggle for self. Oxford University Press

Beck, U. (1992) Risk Society: Towards a New Modernity. Sage Publications, Inc

Beck, U. and Beck-Gernsheim, E. (1995) The normal chaos of love. Polity Press

Beymer, M. (2012) Grindr and Other Geosocial Networking Applications: Advent of a Novel, High-Risk Sexual Market Place. 2012 National STD Prevention Conference [online]. Available from:

Brown, G.G., Maycock, B.B. and Burns, S.S. (2005) Your picture is your bait: use and meaning of cyberspace among gay men. Journal of Sex Research, 42 (1): 63–73

Bruckman, A. (1992) Identity workshop: Emergent social and psychological phenomena in text-based virtual reality. Available from:

Burrell, E.R.E., Pines, H.A.H., Robbie, E.E., et al. (2012) Use of the location-based social networking application GRINDR as a recruitment tool in rectal microbicide development research. AIDS and Behavior, 16 (7): 1816–1820

Case, S.-E. (1995) Performing lesbian in the space of technology: Part II. Theatre Journal, 47 (3): 329–343

Egan, J. (2000) Lonely Gay Teen Seeking Same. The New York Times Magazine [online], 10 December. Available from:

Giddens, A. (1991) Modernity and Self-identity: Self and Society in the Late Modern Age. Stanford University Press

Greteman, A.J. (2012) Country Queers: Queer Youth and the Politics of Rural America. Journal of LGBT Youth, 9 (1): 63–66

Gross, L. (2007) “Foreword.” In O’Riordan, K. and Phillips, D.J. (eds.) Queer Online: Media, Technology and Sexuality. Peter Lang Publishing. pp. vii–x

Haythornthwaite, C. (2007) “Social Networks and Online Community.” In Joinson, A., McKenna, K., Postmes, T., et al. (eds.) Oxford Handbook of Internet Psychology. OUP Oxford. pp. 121–137

Hermannsdóttir, M.B. (n.d.) Self-Identity in Modernity.

Horsley, R. (2004) “Masculinities on the Web.” In Gauntlett, D. and Horsley, R. (eds.) Web.Studies. Bloomsbury Academic

Hussain, Z. and Griffiths, M.D. (2008) Gender Swapping and Socializing in Cyberspace: An Exploratory Study. CyberPsychology & Behavior, 11 (1): 47–53

Jagose, A. (1997a) “Introduction.” In Queer Theory: an introduction. New York University Press

Jagose, A. (1997b) Queer Theory: an introduction. New York University Press

Karl, I. (2007) “On-/Offline: Gender, Sexuality, and the Techno-Politics of Everyday Life.” In O’Riordan, K. and Phillips, D.J. (eds.) Queer Online: Media, Technology and Sexuality. Peter Lang Publishing. pp. 45–66

Landovitz, R.J., Tseng, C.-H., Weissman, M., et al. (2012) Epidemiology, Sexual Risk Behavior, and HIV Prevention Practices of Men who Have Sex with Men Using GRINDR in Los Angeles, California. Journal of Urban Health

Laukkanen, M. (2007) “Young Queers Online: The Limits and Possibilities of Non-Heterosexual Self-Representation in Online Conversation.” In O’Riordan, K. and Phillips, D.J. (eds.) Queer Online: Media, Technology and Sexuality. Peter Lang Publishing. pp. 81–101

Light, B. (2007) Introducing masculinity studies to information systems research: the case of Gaydar. European Journal of Information Systems, 16 (5): 658–665

MaGlotten, S. (2007) “Virtual Intimacies: Love, Addiction, and Identity @ The Matrix.” In O’Riordan, K. and Phillips, D.J. (eds.) Queer Online: Media, Technology and Sexuality. Peter Lang Publishing. pp. 123–138

McMillan, D.W. (1976) Sense of community: an attempt at a definition.

McMillan, D.W. and Chavis, D.M. (1986) Sense of community: A definition and theory. Journal of community psychology, 14 (1): 6–23

Mead, G.H. (1934) Mind, Self, and Society: From the Standpoint of a Social Behaviorist. Morris, C.W. (ed.). The University of Chicago Press

Mellor, P.A. and Shilling, C. (1993) Modernity, self-identity and the sequestration of death. Sociology, 27 (3): 411–431

Mowlabocus, S. (2008) Revisiting old haunts through new technologies Public (homo) sexual cultures in cyberspace. International Journal of Cultural Studies, 11 (4): 419–439

O’Riordan, K. (2007) “Queer Theories and Cybersubjects: Intersecting Figures.” In O’Riordan, K. and Phillips, D.J. (eds.) Queer Online: Media, Technology and Sexuality. Peter Lang Publishing. pp. 13–30

Oakes, P.J., Turner, J.C. and Haslam, S.A. (1991) Perceiving people as group members: The role of fit in the salience of social categorizations. British Journal of Social Psychology, 30 (2): 125–144

Paradis, E. (n.d.) Searching for Self and Society: LGBT Youth Online.

Tajfel, H. (1979) Individuals and groups in social psychology. British Journal of Social and Clinical Psychology, 18 (2): 183–190

Tajfel, H. (1982) Social psychology of intergroup relations. Annual review of psychology, 33 (1): 1–39

Tajfel, H. (1979) Differentiation Between Social Groups. Academic Press, Inc

Toch, E. and Levi, I. (2012) What can “people-nearby” applications teach us about meeting new people? In September 2012. ACM

Trepte, S. (2006) Social identity theory. Psychology of entertainment, pp. 255–271

Tudor, M. (2012) Cyberqueer Techno-practices: Digital Space-Making and Networking among Swedish gay men.

Turkle, S. (1997a) “Aspect of the Self.” In Life on the Screen: Identity in the Age of the Internet. Simon & Schuster. pp. 177–209

Turkle, S. (1997b) Life on the Screen: Identity in the Age of the Internet. Simon & Schuster

Turkle, S. (1997c) “TinySex and Gender Trouble.” In Life on the Screen: Identity in the Age of the Internet. Simon & Schuster. pp. 210–232

Turkle, S. (1997d) “Virtuality and Its Discontents.” In Life on the Screen: Identity in the Age of the Internet. Simon & Schuster. pp. 233–254

Van Gelder, L. (1991) The strange case of the electronic lover. Computerization and controversy. Boston, MA: Academic, pp. 364–375

Wakeford, N. (2002) “New technologies and “cyber-queer research.” In Richardson, D. and Seidman, S. (eds.) Handbook of Lesbian and Gay Studies. SAGE Publications. pp. 117–144

Walther, J.B. (1996) Computer-Mediated Communication: Impersonal, Interpersonal, and Hyperpersonal Interaction. Communication Research, 23 (1): 3–43

Warner, M. (2000a) “The Ethics of Sexual Shame.” In The Trouble with Normal: Sex, Politics and the Ethics of Queer Life. Harvard University Press. pp. 1–40

Warner, M. (2000b) “What’s Wrong With Normal.” In The Trouble with Normal: Sex, Politics and the Ethics of Queer Life. Harvard University Press. pp. 41–80

Woodland, R. (1995) Queer Spaces, Modem Boys, and Pagan Statues: Gay/Lesbian Identity and the Construction of Cyberspace [online]. Available from: [Accessed 17 April 2013]

Woodland, R. (1999) “I plan to be a 10”: Online literacy and lesbian, gay, bisexual, and transgender students. Computers and Composition, 16 (1): 73–87



Web 3.0 – The Internet which is too good for people

Search engines have become more advanced in the methods they use to index web pages over the years, becoming more sophisticated, more trusted, and more technical. The first generation of search engines crawled the Internet indexing the words on each page; this could easily be hijacked by filling pages with the same words, thus achieving a high ranking when people searched for that term. This was a technique used by porn sites to drive traffic to them.

Google, AltaVista and Yahoo started innovating in the field, building their indexes not on word counts but on metrics such as links to the web page and how trusted the source of each link was. This led to more reliable results, taking into account the reliability and trust of content sources on the Internet. However, when indexing content the system has limited understanding of what that content is about; it cannot determine whether it is a person, an event, or a location being discussed in the content being indexed.

Answering such questions requires extracting semantic information from web pages, allowing computers to better determine the meaning of the content. This leads to the definition of Web 3.0:

Web 3.0, a phrase coined by John Markoff of the New York Times in 2006, refers to a supposed third generation of Internet-based services that collectively comprise what might be called ‘the intelligent Web’ — such as those using semantic web, microformats, natural language search, data-mining, machine learning, recommendation agents, and artificial intelligence technologies — which emphasize machine-facilitated understanding of information in order to provide a more productive and intuitive user experience.


The semantic web itself is defined by the W3C as:

“The Semantic Web provides a common framework that allows data to be shared and reused across application, enterprise, and community boundaries”, and Tim Berners-Lee described it as: “If HTML and the Web made all the online documents look like one huge book, RDF, schema, and inference languages will make all the data in the world look like one huge database”.

There is a movement to create ontologies for the Internet; this is being achieved through the creation of OWL (Web Ontology Language). Such languages are ways of representing knowledge and data within specific domains. They are based on the standardized RDF/XML serialization and on formal semantics, all of which has been standardized by the W3C. This way of representing information and knowledge has been embraced by the medical industry.

SPARQL (SPARQL Protocol and RDF Query Language) is a query language for the semantic web. It is an RDF query language for large databases that can retrieve and manipulate data stored in RDF frameworks. It is one of the main languages used to query ontologies, and has been implemented in many languages, along with tools allowing it to be translated into other query languages such as SQL and XQuery.

Several large institutions within the UK have implemented endpoints in their systems that allow their data to be queried with SPARQL. These include the UK government and the University of Southampton.

As said above, the semantic web is about referencing knowledge and information between objects and domains. This is where RDF comes in, as it was developed by academics for artificial intelligence, where everything has to be cross-referenced to gain meaning. However, because it was made by academics it is harder to understand than XML or JSON, as it was not designed for readability. An example of how this kind of system can be used is the “I want to sell / I want to buy” scenario. A person looking for a phone would use an application which searches the Internet, much as a search engine’s crawler does, looking for RDF files containing information about people wanting to sell a phone; this would be done in real time. Conversely, a person wanting to sell a phone would again use an application to create the RDF files, which would also have to include pointers to information about the manufacturer of the phone, potential dealers, and the components within it, tying it into the semantic web. There are several issues and questions: how do we find the files, how do we search, and what if people call things by different names? This is where ontologies come in: they allow look-up of relevant information about a topic, allowing the application to identify similar or relevant knowledge on the Internet.
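A minimal sketch of how the “I want to sell / I want to buy” scenario works over RDF-style triples. The predicate names and listings here are invented for illustration; a real implementation would use Turtle or RDF/XML and a SPARQL endpoint, but the pattern-matching idea is the same.

```python
# Knowledge represented as (subject, predicate, object) triples,
# as in an RDF store. All identifiers below are hypothetical.
triples = [
    ("listing:1", "offers", "product:phone"),
    ("listing:1", "soldBy", "person:alice"),
    ("product:phone", "madeBy", "manufacturer:acme"),
    ("listing:2", "offers", "product:laptop"),
]

def match(pattern, store):
    """Return variable bindings for triples matching a pattern.
    Terms starting with '?' are variables, as in a SPARQL basic
    graph pattern; other terms must match exactly."""
    results = []
    for s, p, o in store:
        binding = {}
        ok = True
        for term, value in zip(pattern, (s, p, o)):
            if term.startswith("?"):
                binding[term] = value
            elif term != value:
                ok = False
                break
        if ok:
            results.append(binding)
    return results

# "Who is selling a phone?" -- find listings offering product:phone,
# then look up the seller of each listing.
phone_listings = match(("?l", "offers", "product:phone"), triples)
sellers = [b2["?who"]
           for b in phone_listings
           for b2 in match((b["?l"], "soldBy", "?who"), triples)]
print(sellers)  # -> ['person:alice']
```

The two chained `match` calls correspond to the joins a SPARQL engine performs across triples, which is how an ontology lets an application connect a listing to its seller and manufacturer.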

I personally feel that a true implementation of the semantic Internet is a long time off. Yes, there will be stores that hold information in RDF format, and yes, there will be endpoints for specialist SPARQL datasets such as governments and open data. However, I feel the current form of the Internet is too well established to move on quickly. If we do move to the semantic Internet we are going to need a number of large players on the Internet to start adopting and developing for it: companies like Google, Microsoft, Facebook, and potentially trust companies like VeriSign.

I do feel there will be a push to use these technologies in social networks as data protection and privacy become an issue. The technologies of the semantic Internet would allow people to store their own data while still allowing it to be used and discovered, turning today’s ‘big data’ into self-managed big data and overcoming the privacy issues currently seen with big data.

If it is implemented, end users may not realise, as it would more likely be implemented at the data layer, potentially offering a few more features to the end user but mainly benefiting data manipulation, search and storage. This most likely means that if and when it gains ground, Web 2.0 will still exist. Web 2.0 will become the easily accessible form of development for newcomers to the Internet, whereas Web 3.0 will be used by the big business systems which can invest in the technology, ultimately redefining the meaning of Web 3.0 in a post hoc definition. Thus, for now, the Web 3.0 movement remains firmly bound in specialised implementations until the technology is accessible enough for mass deployment.


Bitcoins – Born To Die

Throughout the past few years there has been a crisis of trust in the global financial system, one of the main causes being the financial crisis that began in 2008. At the same time there has been a growth in interest in alternative currencies, which are backed not by governments but by other institutions. The majority are local to a geographical region, such as the Bristol Pound and the Liberty Dollar.

One digital currency that has gained prominence and a large market valuation recently is Bitcoin, which has seen market capitalizations as high as $2,614,956,613 through online trading.

Bitcoin Market Valuation

Bitcoin started in 2008 when a paper outlining the foundations and theory that would underpin the currency was published anonymously. It described a decentralized currency based on cryptographic principles, which can be broken down into two systems:

  • Transactions - all transactions are recorded in a public ledger, which is hosted on each bitcoin mining client.
  • Mining - transactions are hashed into a block, which is then distributed around the bitcoin network to be solved; the first person to solve it is awarded an amount of bitcoins (the reward reduces over time). The solution is then replicated through the network to verify the transactions, and any transaction which cannot be verified is rejected. The difficulty of solving a block is adjusted over time, making blocks progressively harder to solve.
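The mining step above can be sketched as a proof-of-work loop. This is a deliberately simplified illustration (the difficulty rule here is “the hash must start with N zero hex digits”, and a single SHA-256 is used); real Bitcoin uses double SHA-256 against a full 256-bit target.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Find a nonce such that sha256(block_data + nonce) starts with
    `difficulty` zero hex characters -- a toy proof-of-work puzzle."""
    nonce = 0
    prefix = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce
        nonce += 1

# "tx1;tx2;tx3" stands in for the hashed transactions in a block.
nonce = mine("tx1;tx2;tx3", difficulty=3)
digest = hashlib.sha256(f"tx1;tx2;tx3{nonce}".encode()).hexdigest()
print(nonce, digest[:10])
```

Each extra zero of difficulty multiplies the expected number of hash attempts by 16, which is why mining rigs compete on hashing rate and power consumption.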

Because it was initially a geek-led community with a hardware culture, a sense of a new economic movement, and competition to mine the coins, leaderboards and forums emerged based solely on the rigs and hardware specs, ranking them on power consumption, processing power, and hashing rate. These boards included everything from normal home PCs that had been specced out for gaming to specially designed systems featuring arrays of GPUs.

As mentioned before, the reducing number of bitcoins per block and the increasing difficulty mean that the number of hashes required to break one block increases, yielding a lower return; thus the amount of processing power required to mine a coin will increase. But processing power is not an infinite resource and is limited by the cost of running the rigs, so to make mining a profitable endeavour the market valuation of a bitcoin must exceed the cost of the rig and the energy used to mine the coin. It has been estimated that $15,000 a day is spent on energy just to mine bitcoins; at a current valuation of $106 USD/BTC this means at least 142 bitcoins would have to be mined a day for mining to be profitable. In the last 24 hours 166 blocks were broken, releasing a total of 4,150 coins, so it is still a profitable activity.
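A back-of-the-envelope check of these break-even figures, using the quoted $15,000/day energy estimate and $106/BTC price; the 25 BTC block reward is implied by 166 blocks yielding 4,150 coins.

```python
# Figures quoted in the text above.
energy_cost_per_day = 15_000   # USD spent on energy per day
btc_price = 106                # USD per BTC
reward_per_block = 25          # BTC per block (implied by 4150 / 166)
blocks_per_day = 166

# Coins needed per day just to cover the energy bill.
break_even_coins = energy_cost_per_day / btc_price
coins_mined = blocks_per_day * reward_per_block

print(round(break_even_coins, 1))       # -> 141.5
print(coins_mined)                      # -> 4150
print(coins_mined > break_even_coins)   # -> True: mining remains profitable
```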

However, the equilibrium point will drive individuals to mine more efficiently using less power as the number of bitcoins per block reduces and the number of hashes per block increases; this is where custom-made hardware is used, as it requires less power than GPUs and CPUs. Due to the nature of the community and the drive to gain coins, there have been a number of reports of innovative ways of mining them. These include reports in 2010 of large institutional supercomputers and clusters being hacked into and used to mine coins, leaving no trail of the perpetrator, and more recently a system exploiting a security hole in Skype that allowed software to be installed which used all the spare capacity of the computer’s CPU to mine coins.

Some of the recent demand for the currency stems from the looming and ongoing financial crisis in the western world, most recently the freezing of bank deposits in Cyprus. This is mainly because people see Bitcoin as free from government control: any currency they hold in bitcoin cannot be frozen by any country. However, there are issues with the currency not being backed by governments or banks, giving it relatively limited liquidity. Also, even though the massive increase in the value of bitcoins is seen as good, in reality the bitcoin economy has been going through massive deflation: instead of spending the currency for which it was designed, people start to hoard it, seeing it as an investment mechanism instead of a transaction mechanism. The hoarding of coins reduces the supply, which in turn increases the market value.

Anything can be used as money: gold can be used as money, beans can be used as money; it only requires people to accept the currency and to trust its value. This is an issue for Bitcoin currently, as very few places accept the currency, since it lacks the backing of financial institutions to process the transactions. This does not mean it is not a currency, but it is the same as other goods that can be exchanged for value while lacking ‘moneyness’.

However, there is scepticism about the system from many people. Even though it is gaining attention, it is still a very niche currency which, despite increasing in value recently, has been very volatile in the past, recently dropping over 50% of its market capitalization. Volatility has come from companies that stored users’ bitcoins vanishing, from floods of coins onto the market, and recently from issues with Mt.Gox, the largest bitcoin trading site, suffering a large number of DoS attacks.

One of the main issues which I believe will affect the value of the currency is that once the coins reach a certain level of scarcity, the community which competes to mine them will vanish. A large proportion of the community uses mining as a justification for competing to build these powerful machines, so when there is no point in mining they will move on to a new justification for them. This will decrease demand for the coins, forcing down the valuation and essentially making them worthless. However, if the level of ‘moneyness’ has reached a point which can sustain the currency, then the valuation may no longer be as relevant.

An interesting correlation has been shown between the valuation of coins and the number of searches for ‘bitcoins’; thus once people are bored with it … it could die.



NOSQL – The Famous Four

Relational databases (RDBMS) have been the staple of data storage systems for a long time; however, they have been joined by a new generation of databases which come under the category NOSQL. The term NOSQL was coined by Carlo Strozzi in 1998 to describe his lightweight, open-source relational database which did not have an SQL interface. It is now commonly understood that NOSQL means ‘Not Only SQL’ (there is sometimes a belief that it means ‘No SQL’). Strozzi believes the movement should be called NoREL (Not Only Relational), as these databases are losing the relational aspect, not only the SQL. The underlying insight is that rows and relationships are not the most efficient way to store data for a lot of applications. However, NOSQL systems are usually more specialized than RDBMS: there are typically few built-in functions, in comparison to an RDBMS, which has a multitude, such as time functions, comparison, and search. They usually aim for greater scalability than RDBMS, but this means they lose some fundamentals of databases that have come to be relied upon.

The four most common forms of NOSQL (document, graph, key-value, and wide-column stores) are described below, along with the related category of object databases:

Document Store

This takes the commonly understood concept of a schema-less document as the container for the information. Metadata is then attached to each document, allowing documents to be accessed, retrieved and organized. This can be done through collections, tags, and hierarchical trees of documents and metadata.

As with any form of NOSQL system, similarities can be drawn with RDBMS systems. Here a collection of documents could be seen as a table, with each row being a document. However, there is no guarantee that each document will have the same fields within it, as it is schema-less.

Retrieval of documents is based on a ‘key-value’ system where each document has a key; however, different systems allow for different queries that search the content of each document.

There are a number of widely used document-based databases, the leading one being MongoDB. This is an open-source system supported by 10gen. It uses JSON-style documents with dynamic schemas, thus allowing documents to have different fields and values. It has been adopted by a number of large companies, such as MTV, Craigslist, and Foursquare, for the ease and speed with which they are able to access and modify documents, which can represent real entities better than a traditional relational system can.
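A document store in miniature: plain Python dicts standing in for MongoDB-style JSON documents. The collection name and fields are invented; the point is the schema-less documents, key-based retrieval, and content queries described above.

```python
# A "collection" mapping document keys to schema-less documents.
users = {}

users["u1"] = {"name": "Ada", "tags": ["admin"], "city": "London"}
users["u2"] = {"name": "Bob"}  # no guarantee documents share fields

# Key-based retrieval is a direct lookup...
doc = users["u1"]

# ...while content queries scan documents, tolerating missing fields.
def find(collection, field, value):
    """Return all documents whose `field` equals `value`."""
    return [d for d in collection.values() if d.get(field) == value]

print(find(users, "city", "London"))
# -> [{'name': 'Ada', 'tags': ['admin'], 'city': 'London'}]
```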

Graph Databases

These have been designed for systems where the data is best represented as graphs, with nodes, edges and properties. They have been used for representing public transport systems, roads, and topologies, where there is a need to represent an undetermined number of connections between points in an easy and simple way. Nodes are similar to the objects stored within object databases, allowing IDs, names and other information to be stored about each node, whereas edges are what the nodes are connected with; they represent the relationships between them.

For associative data sets they allow a performance increase over traditional relational databases, allowing the structure of a system to be mapped more directly. Like document store systems they have a less rigid schema, which means they can adapt to storing different types of objects; e.g. bus stations and bus stops may require different types of data. They allow for more powerful queries of the graph compared to RDBMSs, as they use the full power of graph theory and do not have to abstract on top of a relational system. Neo4j is currently the leading graph database system; it is written in Java but has built-in APIs and wrappers allowing it to be accessed from different languages and systems.
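A sketch of graph-style storage and traversal, using an adjacency dict rather than a real Neo4j instance; the station names and properties are invented. The shortest-path query is the kind of whole-graph operation that is awkward to express over relational tables but natural in a graph store.

```python
from collections import deque

# Nodes carry properties; edges record relationships between node ids.
nodes = {"A": {"type": "bus station"}, "B": {"type": "bus stop"},
         "C": {"type": "bus stop"}, "D": {"type": "bus station"}}
edges = {"A": ["B"], "B": ["C"], "C": ["D"], "D": []}

def shortest_path(start, goal):
    """Breadth-first search over the edge lists, returning the first
    (shortest) path from start to goal, or None if unreachable."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in edges[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(shortest_path("A", "D"))  # -> ['A', 'B', 'C', 'D']
```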

Key-Values Store

As with all the systems mentioned so far, it is again schema-less. These stores are similar to the key-value pairs used within data structures in all the large programming languages, such as Java and Python, where a key is associated with a value, and to access the value one must look up the associated key.

However, each value must have a unique key to retrieve it by. This allows for quicker access, as seen in simpler data stores: instead of having to query nodes which could be spread across multiple systems, it allows direct access to the key and thus the value, as it is a flat structure. The simple nature of the data structure also means the code to access it is very simple in comparison to SQL. However, whereas the tables in an RDBMS make access control straightforward to implement, it is more challenging here, as there is no separation of data.

This category can be broken down into a number of sub-categories, as this form of data store can be adapted for different needs. Some of the main variants store the data within memory, allowing fast access, while others store it to disk, allowing data to be persisted. Redis, which is leading the key-value database movement, is open-source, networked, and stores data in-memory.
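An in-memory key-value store in the Redis spirit: a flat dict with O(1) access. The method names loosely mirror Redis commands (SET/GET/DEL) but this is only an illustration of the flat, query-planner-free access pattern, not the Redis API.

```python
class KVStore:
    """A toy in-memory key-value store."""

    def __init__(self):
        self._data = {}

    def set(self, key, value):
        self._data[key] = value

    def get(self, key, default=None):
        # Direct lookup by key: no query planning, no joins.
        return self._data.get(key, default)

    def delete(self, key):
        # Returns True if the key existed.
        return self._data.pop(key, None) is not None

store = KVStore()
store.set("session:42", {"user": "ada", "ttl": 3600})
print(store.get("session:42"))    # -> {'user': 'ada', 'ttl': 3600}
print(store.delete("session:42")) # -> True
print(store.get("session:42"))    # -> None
```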

Object Databases

Object database (OODBMS) systems store information in objects in much the same way object-oriented programming languages (OOPLs) do. This allows objects to be created, stored and retrieved without any need to convert them into another form, as would have been needed when storing them in a traditional DBMS. As they are so heavily integrated with the programming languages, the schema can be maintained and manipulated within the same development environment.

In the early days, OODBMSs were seen as adding persistence to object programming languages. Like any database, they give developers the ability to query large sets of objects with a simple language. Objects are usually retrieved by following pointers, so the need for expensive joins on tables is removed, reducing the access time to the data. Many features have been added over time that are only possible by making objects persistent, such as versioning, which allows developers to access past states of objects.

Wide-column Databases

These are all based on column-family databases, which are used for extensively large data sets that have large access overheads if stored as rows; these types of systems are designed to scale to petabytes in size. They can be compared to RDBMSs:

  • A table in an RDBMS corresponds to one or more column families (a column family being a collection of columns)
  • A column in a column family is a triplet consisting of a key, a value, and a timestamp.

The reason for serialising the data in columns instead of rows is performance. Firstly, if you only want to query three out of ten columns in a table, you only have to access those three, whereas data stored in rows requires the whole table to be accessed. This also makes performing tasks on one column at a time quicker, as again the read and write time to access the column is reduced. There are also performance gains when adding and removing large amounts of data at a time, as you simply add or remove the column rather than iterating through each row. However, if there is a need to access one row in the data set, multiple operations will have to be performed, though such accesses are usually limited in large datasets.
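The row-versus-column trade-off above can be shown with a toy example: the same (invented) table stored both ways. Summing one column touches every whole row in the row layout, but only one contiguous list in the column layout, and dropping a column is a single operation.

```python
# The same table in a row-oriented layout...
rows = [
    {"id": 1, "name": "Ada", "age": 36},
    {"id": 2, "name": "Bob", "age": 41},
    {"id": 3, "name": "Cyd", "age": 29},
]

# ...and serialised by column: one contiguous list per column.
columns = {
    "id":   [1, 2, 3],
    "name": ["Ada", "Bob", "Cyd"],
    "age":  [36, 41, 29],
}

# Row layout: summing ages means reading every whole row.
row_sum = sum(r["age"] for r in rows)

# Column layout: only the "age" column is touched.
col_sum = sum(columns["age"])

print(row_sum, col_sum)  # -> 106 106

# Dropping a column is one operation, not an iteration over every row.
del columns["age"]
print(list(columns))  # -> ['id', 'name']
```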

There are many variations of wide-column databases, with Google’s BigTable being credited with starting the trend in this type of database design. Other notable ones are HBase, Cassandra and Hypertable, with Facebook having introduced Cassandra for storing vast amounts of user information across multiple servers.

As you can see, there has been a fragmentation and specialisation in the database field. This has led to many improvements in systems, giving developers stores better suited to their data. However, it has also led to a fragmentation in knowledge about each system: initially, RDBMS had SQL as a simple and universal language which gave anyone with knowledge of SQL access to the data within the majority of RDBMS, but now, with the fragmentation, there are many different ways of accessing and setting up these data stores, each requiring specialist knowledge. This knowledge is also needed when deciding which one to use. I also agree strongly with Carlo Strozzi: as I have shown, it is not the removal of SQL which is defining the field but the move away from the traditional relational model of databases to an ever-increasing use of schema-less systems, which should allow for more specialised and adaptive development of existing and new systems.



Mashups and APIs – Breaking down barriers

Since the early 2000s there have been many developments within web development, in the form of economic, social, and technological innovation within the online space. These changes have been put to many different uses, private and public.

To try and define this change, the phrase Web 2.0 was coined. The first mention was in 1999, in an article by DiNucci, where it was described along the lines of more screens, more pipes, more information; in the current context this could be seen as an ‘Internet of everything’ definition. A more widely understood definition was given by Tim O’Reilly in 2007, in reflection on events. He defined it by seven characteristics:

  • Services, not packaged software, with cost-effective scalability
  • Control over unique, hard-to-recreate data sources that get richer as more people use them
  • Trusting users as co-developers
  • Harnessing collective intelligence
  • Leveraging the long tail through customer self-service
  • Software above the level of a single device
  • Lightweight user interfaces, development models, AND business models

This was achieved by giving a narrative to the business sector; however, he talked about Web 2.0 businesses having already existed before the dot-com bubble, and these were the ones which survived because they drew on collective intelligence. It has been acknowledged that these companies existed earlier, but they became Web 2.0 companies after the dot-com boom, as this was when they embraced the social nature of the web along with advances in different technologies and the emergence of agile development frameworks. This is backed up by academics defining Web 2.0 as a paradigm shift in how people used the Internet, along with making it more accessible to non-technical users. A common narrative is thus one of utilising a social dimension; this has taken the Internet from a read-only system where data was consumed to a read/write one where the users help to generate the data, fulfilling the original dream Tim Berners-Lee had for the World Wide Web.

The changes in communication and visualization technology have led to the Internet gaining more ‘hackability’, which has led to mashups gaining prominence in online development. A mashup has traditionally been defined as a combination of two or more original sources to form a new entity, but in Web 2.0 it is the creation and reuse of technologies and data, bringing them together to make a new application, widget or service. Mashups can be broken down into three different types: presentational, data/functionality, and process. All definitions come from the same understanding but vary in the granularity at which they are used, with all increasing in complexity as they get closer to the data and system layers.

In the mid 2000s there was an explosion in tools which made the manipulation of open XML and JSON data sources on the Internet accessible; these included:

However, over the years these have been closed down as their popularity dwindled and companies moved on to other products. But at the time they were an innovation which opened up online data manipulation to the masses. Yahoo Pipes was by far the most popular; it had a number of modules which allowed users to manipulate and search data streams, including:

  • Location – search by location
  • String – allows for regex on strings
  • Date – manipulation of numeric and text representations of dates
  • Number – applies simple numeric operations to numeric input
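The module-chaining idea behind Yahoo Pipes can be sketched as small composable filters over a feed. The feed items and the module chain below are invented for illustration; each function plays the role of one pipe module.

```python
import re

# A hypothetical feed of items, as a pipe would receive from an RSS source.
feed = [
    {"title": "Crime report: London", "date": "2013-04-01"},
    {"title": "Weather update", "date": "2013-04-02"},
    {"title": "Crime report: Bristol", "date": "2013-04-03"},
]

def string_filter(pattern):
    """A 'String' module: keep items whose title matches a regex."""
    return lambda items: [i for i in items if re.search(pattern, i["title"])]

def sort_by(field):
    """A sorting module: order items by a field."""
    return lambda items: sorted(items, key=lambda i: i[field])

def pipe(source, *modules):
    """Run a feed through a chain of modules, like wiring pipe blocks."""
    for module in modules:
        source = module(source)
    return source

result = pipe(feed, string_filter(r"^Crime"), sort_by("date"))
print([i["title"] for i in result])
# -> ['Crime report: London', 'Crime report: Bristol']
```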

There were a number of interesting implementations of pipes for cleaning Twitter feeds and for pulling location data out of news articles.

There are hundreds of companies supplying feeds of structured data in XML or JSON format which can still be used. Some of the more popular mashups overlay data onto mapping systems such as Google Maps; one popular example shows crime reports for areas in the UK.

Recently there has been an explosion in the number of APIs offered by online companies for using their services. These come in a number of different forms, allowing different implementations within code: SDKs, which allow for native integration into a specific environment and language, such as Java for Android, or calls to URLs, which may implement REST functions or access points to data which then have to be parsed by the client. This has given developers greater access to large commercial services, helping them develop their applications and products with minimal effort. This can be seen in the use of PayPal as a payment service: it can be integrated into an application with a minimal amount of code, which can then help a developer monetize their application or sell products, where in the past they would have had a large number of hoops to jump through. Another example is that using Facebook’s SDKs allows apps to be integrated into the largest social network in the world, increasing their exposure to the market.
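A sketch of the client side of such a URL-based API: the endpoint and fields below are hypothetical, and the JSON payload is hard-coded as a sample of what a payment-status call might return; a live integration would fetch the body over HTTP (e.g. with `urllib.request`) instead.

```python
import json

# What a GET to a hypothetical /v1/payments/123 endpoint might return:
response_body = (
    '{"id": "123", "status": "COMPLETED",'
    ' "amount": {"value": "9.99", "currency": "GBP"}}'
)

# The client parses the JSON and works with plain data structures.
payment = json.loads(response_body)

if payment["status"] == "COMPLETED":
    total = float(payment["amount"]["value"])
    print(f'Paid {total:.2f} {payment["amount"]["currency"]}')
    # -> Paid 9.99 GBP
```

This parse-and-use step is all the developer writes; the heavy lifting (payment processing itself) stays with the provider, which is exactly the appeal, and the risk, of depending on external services.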

However, a high dependency on external systems developed outside of their control exposes developers to a high risk. This can come in a number of forms. First, the quality of service to end users may be affected: if users can only log in with their Facebook accounts and Facebook goes down, then users cannot log in; or if the provider is handling a large volume of traffic, response times may be reduced, slowing down your service. Other risks are changes by the supplier to their service: removing a function which you are using could render your system unusable, or placing limits on calls to their service could make your system unusable for some people once the limit is exceeded, meaning you would need to refactor your system to implement changes, with unhappy users as your service becomes slow or unusable. Thus any form of external dependency should be taken with a pinch of salt. Even with these issues, they can still bring a large number of advantages to systems, such as increased functionality, reduced cost, and reduced development time.

These levels of system integration require a greater knowledge of systems and programming than the simple data mashups mentioned before. But everything mentioned here ties into the definition of Web 2.0 given by Tim O’Reilly in 2007:

  • Leveraging the long tail by providing access to historic services and data
  • Services, not software, as companies rely on payment systems provided by other people
  • Control over data sources comes from the companies which have developed the systems people are using
  • Leveraging collective intelligence and treating users as co-developers comes from allowing people to mash up data in the first place.

Web 2.0, APIs and mashup tools are developing a more open and sustainable environment for content and service creation and consumption, breaking down barriers by harnessing the power of the Internet and allowing small and big players to co-exist, benefiting from each other’s specialities.



The Long Tail of a Hipster

There is a constant drive of innovation in technology, making it faster, smaller and cheaper, all sold with the premise that it is better than the last iteration, which was released five months ago. This does not make any previous technology irrelevant, though, as there is an ever-growing long tail of technology. From a cultural perspective, the long tail has been defined by Chris Anderson as “content that is not available through traditional distribution channels but could nevertheless find an audience”, meaning that even once content has left traditional distribution it can still be acquired through alternative sources such as VOD (video on demand). Enabling access to content and technology long after the original distribution allows a second wave of meaning to come from it.

in the history of technology, it is not rare for technology to gain significance long after their inception

Theodore Adorno (1969: p. 283)

This form of long tail is becoming significantly more important in current culture and technology. Does the fast forward movement of technology and culture mean that we take comfort and nostalgia from old mediums? Does it mean we still value them so much that we can't bear to let them go? Is this the beginning of a post-modern technology movement? Whatever it is, it is an ever-growing trend, with some notable examples:

  • Instagram: an app for Android and iOS, and a social media photo-sharing site, which allows the application of 'vintage' filters to images, making them look as though they were taken on old film cameras, and lets them be shared to the masses on other social networking sites.
  • Lomography: old film cameras which reproduce images as though they were taken with grainy toy cameras, letting you buy multiple different models and accessories, and again share the results online.
  • Vinyl records: these had been in steady decline for years, though they remained the go-to medium for DJs. Recently sales have soared, with people now willing to pay on average £16.30 for a record, compared with £7.82 for a CD in a shop.

All of these movements started out small but eventually had a boom phase; Instagram now has upwards of 100 million registered users and is still growing. Initially these could be classed as cult (counter-cultural) movements, attracting only the people willing to hold onto the technology and believe in it. But through these technologies being legacy and iconic, have these subcultures become accessible to the masses? Does making a cult into an app mean that people can be members of the cult through a glass screen? If everyone is trying to be individual, do cults become the norm? Do they become mainstream, meaning that the person next to you is as individual as you, which is not that individual at all? Yet because they are contained in a device or product which can be concealed, internally you believe you are individual, and can be individual at will, breaking the rules of the norm at any time, when in fact anyone can do that; thus the hidden individualism is just a hidden norm.

Does this retro-futurism allow for a post-hoc reflection on technology? Does it allow us to discover and instill romanticism in technology that once may have been seen as clinical and impersonal? Sales of SodaStreams and hostess trolleys are on the increase after being reintroduced with new, sleek, slender and easier-to-use designs. Is this a throwback to a time we think was better, when we entertained and made our own pop? There was a reason these vanished for a time; are we forgetting this vanishing and just remembering the nostalgia? This could certainly be said of film cameras: production of film is now at an all-time low, and prices at an all-time high, yet people still buy film for vintage cameras even though the cost of getting the final picture is significantly higher than using digital. So are these reinventions doomed from the start?

It could be argued that these mediums become an accessible art form for the masses. Once they have been commoditized, such as 35mm film and vinyl, and then decommoditized, there is still cultural demand (cultural commoditization) for them, and the knowledge barrier to their use has been overcome. As cultural value is high and supply is low, they start to be valued as an art form: vinyl is placed on the wall for its cover art, and people treasure the grainy photos taken with a Lomography camera, even the bad ones.

Does this mean that for technology to gain human meaning we should follow the hipster? Are these brogue-wearing, gin-drinking, vintage-clothed trend-setters of the social retro-technology movement indicating which technology has significant cultural meaning and limited supply, making it a new form of pop art?


Devops – What is it?

In the past 10 years there has been a drive in software engineering to use agile methods such as Scrum and Extreme Programming (XP).

Agile in general allows the whole process of software development to be more reflective, enabling development to adapt to the ever-changing needs of clients, customers, and anything else the world may throw at the development team. Each of the methods mentioned has different characteristics: Scrum is characterized by standing meetings, and XP by very fast release cycles, with development cycles at different stages.

Traditional software development has operated independently of the IT/systems administration teams within companies: an application would be developed independently of the system it would ultimately run on, or developed for a specifically designed system which would change once in a blue moon. This regimented development can be seen in the flow of code once it has been signed off by the developers and before it goes into production.

Code moving from departments before deployment


This could then mean that once the changes have passed quality assurance and been handed to the systems team, the code may be changed again to allow it to be deployed on existing systems.

A juxtaposition can be seen in the motivations of system administrators and developers.


Developer

  • Deliver change in accordance with requirements
  • The business depends on delivering change

System Admin 

  • Maintain Stability
  • Fight off change
  • Reduce Downtime
  • Change is the Enemy
  • Increase reliability

Both act in isolation within their own domains in the company, focusing on their own goals with their own specific tool sets. When issues arise, blame and fixes are passed between Developers, System Admins and Quality Assurance teams, with no one knowing what to do and everyone believing it is someone else's responsibility.

This leads to risky deployments, where people have no confidence that the code will run on live environments, believing it will only work on the machines it was developed on, and fearing change once the code has been deployed. It also slows down release cycles: even though developers may be working to short release cycles in Agile, the system admins would rather release code into production in larger iterations than smaller ones, as this gives them more time to test it on their systems and requires less work.

Developer and System Admins Release Cycle


Within the system admins' domain there have been many innovations that have allowed for modernisation. Virtualization and cloud computing (Infrastructure and Platform as a Service) have been embraced throughout the industry. This allows provisioning of areas of the system for development on the same system as deployment, isolates applications from each other, allows faster provisioning of resources, and reduces hardware lock-in for data centres. This innovation came through the addition of a layer of abstraction between code and hardware, allowing hardware problems to become software problems.

This in turn has influenced the formation of DevOps (Developer Operations), which has seen an increase in popularity since about 2008. The basic concept behind DevOps is to bring developers closer to the system admin role, and vice versa. Some large companies use this method and have increased their release cycles to 10 to 20 per day, as demonstrated by Flickr.

DevOps is the theory/opinion/movement that by bringing the development of code closer to the development of the infrastructure you can remove or improve some of the problems previously mentioned. It has to be understood that DevOps is not just a person in the company who brings people together, or a set of processes for problem solving; it is an overall belief and philosophy for developing, deploying, testing, maintaining, using and implementing code. It is the belief that there is little to no difference between system admins (system admin teams) and software developers (software development teams). This breaks down the barriers between teams and departments, allowing developers to understand what they are developing on, and system admins to deploy the environments that developers need. It means developers should be able to move from one language to another, from objects in Java to functions in cron jobs. The aim is to limit the distance between development completion and live production.

Even though DevOps is not a person, some people embody it more than others, enabling DevOps for other people. For a person to embody the DevOps philosophy they need a multi-discipline skill set, being willing both to write code and to set up deployment environments. As they sit within multiple camps, they are the ones who make connections between people: facilitating communication, making peace, and acting as goodwill ambassadors. This allows cross-discipline teams to come together and work as one entity.

Currently tooling differs between the teams, but unifying the tools allows a greater understanding of the different roles. This can be done through:

  • Unified processes
  • Unified tooling
  • Version-controlled software libraries and code
  • Deeply modelled systems
  • Automation of manual tasks, e.g. increased use of cron jobs
  • Virtualization
  • Cloud computing
  • IaaS (Infrastructure as a Service)
  • PaaS (Platform as a Service)
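The automation point can be made concrete with a shared, version-controlled deployment script that both developers and system admins run, removing the hand-off between teams. A minimal sketch (the step names in the usage comment are invented for illustration):

```python
def run_pipeline(steps):
    """Run ordered deployment steps; each step is a (name, callable)
    pair whose callable returns True on success. Stop at the first
    failure so a broken build never reaches production."""
    completed = []
    for name, step in steps:
        if not step():
            return completed, name  # steps that ran, and the one that failed
        completed.append(name)
    return completed, None  # everything succeeded

# The same script runs on a developer's laptop and in production, e.g.:
# run_pipeline([("test", run_tests), ("package", build), ("deploy", ship)])
```

Because the script itself lives in version control alongside the code, a change to the deployment process goes through the same review and release cycle as any other change.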

In essence this means that the developers who blame the system admins for creating an unreliable environment, and the system admins who blame the developers for writing unreliable code, become moot as the two roles become intertwined, and the ultimate issues of maintenance and quality control become interwoven into a truly agile method.

NLP and online advertising

This dissertation was an inquiry into, and proof of concept of, using NLP techniques for context-based keyword association for online advertising.

The proof of concept used the NLTK package for Python, which allowed rapid development and prototyping of the system. The system is slow to set up, as the corpora are loaded into memory, but fast once running, although not at production level.

The system mainly used keyword tagging, with some algorithms developed at Lancaster University. The report also dives deep into many of the methods used within corpus-based linguistics.

The conclusion was that yes, it is possible, although pages would potentially need more metadata tagging beforehand, and a greater use of statistical methods, rather than tagging methods, may allow for greater accuracy.
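The statistical direction suggested in the conclusion can be illustrated with a simple term-frequency ranker. This is an illustrative stand-in, not the dissertation's actual algorithm, and the tiny stopword list is a placeholder:

```python
import re
from collections import Counter

# Token placeholder stopword list; a real system would use a fuller one.
STOPWORDS = {"the", "a", "an", "of", "to", "and", "in", "for", "is", "on"}

def keywords(text, n=5):
    """Rank candidate advertising keywords for a page by raw term
    frequency, ignoring common function words."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter(t for t in tokens if t not in STOPWORDS)
    return [word for word, _ in counts.most_common(n)]
```

A production system would weight terms against a reference corpus (e.g. TF-IDF) rather than using raw counts, so that generically common words do not crowd out page-specific ones.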

Project N-Fly Report

Project N-Fly Source Code

Copyright held by Daniel Kershaw