John Schroeder states that Hadoop is the most important technology for businesses to use in the Datacosm created by big data. He states, “In the Datacosm it does not make sense to store data separately from the processing.” Hadoop is software that stitches together commodity servers into a big data platform that scales linearly and easily. A Hadoop distribution provides an enterprise-grade platform that supports targeted marketing applications, improves the accuracy and timeliness of fraud detection, and scales operations with at least 10-to-1 cost efficiency over traditional servers. Hadoop continues to develop and become more powerful as the Datacosm continues to grow.
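The linear scaling Schroeder describes comes from Hadoop's MapReduce model, in which work is split into independent map tasks across commodity servers and aggregated by reduce tasks. A minimal single-machine sketch of the idea in Python (the function names are illustrative, not Hadoop's actual API):

```python
from collections import defaultdict

def map_phase(shards):
    """Map: each commodity node would emit (word, 1) pairs for its own shard."""
    for doc in shards:
        for word in doc.split():
            yield (word, 1)

def reduce_phase(pairs):
    """Reduce: sum the counts for each word, regardless of which node emitted them."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

shards = ["big data platform", "big data scales"]
print(reduce_phase(map_phase(shards)))
# {'big': 2, 'data': 2, 'platform': 1, 'scales': 1}
```

Because map tasks share nothing, adding servers adds capacity roughly linearly, which is the scaling property the summary refers to.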
Advances in integrated circuits have made Application Specific Integrated Circuits (ASICs) and Field Programmable Gate Arrays (FPGAs) possible, providing higher performance and faster speeds. These devices enable the real-time realization of complex mathematical algorithms for data analysis. Ashkan Ashrafi states, “The need to use hardware implementations of digital signal processing algorithms is exponentially increasing due to the explosion of stored data and the necessity of analyzing these data in less amount of time.” These algorithms cannot reach their desired speed using computer software alone. To reach this goal, it is proposed to combine a low-latency parallel digital phase-locked loop (DPLL) with a feed-forward carrier phase recovery (CPR) algorithm, which would compensate for carrier frequency offset and frequency fluctuation.
Communications networks play an increasingly important role in global military operations, and this dependency creates several challenges. Providing secure communications is the single most important aspect of network-centric military communications. A new NATO program “provides transport services via a routing infrastructure in which network layer encryption is employed is extended to address the case in which a multinational transport network is required.” The difficulty lies in delivering the specific performance required while providing reliable network security. Another issue is protecting packet-level communication from malicious and unintentional interference. One proposed remedy is the implementation of special groups to minimize these threats, along with the use of mobile ad hoc networks (MANETs). Research continues on mitigating threats to military communications networks while providing the specific performance that is required.
Thomas Limoncelli describes two successful IPv6 (Internet Protocol version 6) rollout strategies and one unsuccessful strategy, primarily for businesses. IPv6 is an extended address format that allows for 128 bits of source and destination host addresses. That yields roughly 340 trillion trillion trillion addresses, enough to last for the foreseeable future. The need for IPv6 came from the depletion of IPv4 (Internet Protocol version 4) addresses. IPv4 allows only 32-bit addresses, about 4.3 billion potential addresses, which seemed sufficient at the time. After the internet went public and grew substantially, more address space was needed. The IPv6 format is not backwards compatible with IPv4, because IPv4 does not have the 128 bits of address space needed to refer to an IPv6-only destination. This calls for a dual-stack design that allows hosts to speak either protocol. Without investing in IPv6-compatible technology, businesses would miss out on a large client base as more IPv6-only addresses emerge. Limoncelli does not recommend converting everything now. He does recommend proposing a high-value reason to use or convert to IPv6 so that funds are allocated, and he also recommends the use of a load balancer that does IPv6-to-IPv4 translation to offer IPv6 to external customers immediately.
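The address-space figures above can be verified directly, and Python's standard ipaddress module illustrates how IPv4 addresses embed into IPv6 notation on a dual-stack host (the specific address below is a documentation example, not from Limoncelli's article):

```python
import ipaddress

# IPv4: 32-bit addresses
print(2**32)   # 4294967296 -- about 4.3 billion

# IPv6: 128-bit addresses -- about 3.4e38,
# i.e. roughly 340 trillion trillion trillion
print(2**128)  # 340282366920938463463374607431768211456

# Dual-stack software can refer to an IPv4 destination using
# the "IPv4-mapped" IPv6 notation ::ffff:a.b.c.d
mapped = ipaddress.ip_address("::ffff:192.0.2.1")
print(mapped.ipv4_mapped)  # 192.0.2.1
```

The reverse is impossible: a 32-bit IPv4 field simply cannot hold a 128-bit IPv6 address, which is why IPv4-only hosts cannot reach IPv6-only destinations without translation.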
In a March 1, 2014 article, Bryan Betts argues that security and privacy are the two measures that determine the best internet browser. Malware infections and phishing attacks are the most common threats facing users today, and the internet browser is a user’s first line of defense against these threats. To be effective, a browser must constantly update its lists of sites determined to be malicious. When a user visits a website, the site can collect and store information about the user and their computer. Websites typically install HTTP cookies, which are “small pieces of code enabling the site to keep track of you.” Betts goes through the positives and negatives of the most popular internet browsers: Google Chrome, Microsoft Internet Explorer, Apple Safari, and Firefox.
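The cookies Betts describes are, more precisely, small pieces of data a site asks the browser to store and send back on later visits. Python's standard http.cookies module can parse a Set-Cookie header, which sketches the mechanism (the cookie name and value here are made up for illustration):

```python
from http.cookies import SimpleCookie

# What a site might send in an HTTP response to track a visitor
cookie = SimpleCookie()
cookie.load('session_id=abc123; Path=/; HttpOnly')

# The browser stores the cookie and returns it on every later request
# to the same site, which is how the site "keeps track of you"
print(cookie['session_id'].value)        # abc123
print(cookie['session_id']['httponly'])  # True
```

Privacy-focused browser settings work largely by restricting when such cookies are stored or sent back.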
Jaigris Hodson conducted a discourse analysis, described as an analysis of language, of Google’s weblog from 2006 to 2011. Hodson found that Google leaders constructed the company in two ways. One was “the company as a provider of valuable (commodified) information, and the second construction was the company as a public service or an information utility provider.” Hodson also determined that Google values information over anything else, with the “Strong belief that technology is the source of human progress and that technological development is inevitable . . . the unspoken subtext suggests that if people can only get access to the information they need, when they need it, via Google’s search product, then they will be happier and more successful.” By commodifying information to earn advertising revenue, Google marginalizes both the workers who create the technology and the users of its products. One way this happens is by prioritizing search results so that Google properties appear at the top. This ultimately results in two classes of people, “those who control information via their command of technology, and those who rely on others to do it for them.”
Derrick Kerckhove conducted an interdisciplinary study based on specific case studies rooted in the social sciences, systems engineering technology, data representation, and the science of network complexity, in order to create a set of comprehensive features that shape the personal and social sense of digital selfhood and identity. These features work toward defining the digital persona and representing its complex nature. Kerckhove asserts that no inclusive definition of the digital persona exists at this time, due to issues arising from the four kinds of agents that affect it: personal agents, technological agents, institutional and legal agents, and civic agents. The definitions that exist today do not take all of these agents into account and lack a European framework, described as “a point of reference at the EU level regarding digital persona that includes identification, authentication, legal and ethical subjective identity management.” Kerckhove outlines his issues with the current definitions and promises continued research to build on these findings and provide a comprehensive definition that includes all layers and sectors of the digital persona.
In 2006 Van Niekerk defined a “Digital Asset” as “any item of text or media that has been formatted into a binary source that includes the right to use it.” The “Digital Asset” was born out of the power of “Digital Citizenry.” As “Digital Citizens,” people should be able to legally keep, transfer, use, sell, or inherit “Digital Assets” just like any other asset. However, due to a lack of definition, law, and regulation, people’s “Digital Assets” are not protected. In addition, “Because of a lack of legislations and regulations, the concept of ‘Digital Asset’ causes ambiguity between the digital account service provider and the account users.” Alp Toygar asserts that service providers use unethical practices in dealing with these assets. Even though a handful of states have enacted laws regulating “Digital Assets,” these are not enough to address the rapidly growing issue. Toygar calls for enacting the “Federal Cyber Law Act” in order to regulate “Digital Assets” and other cyber problems within the United States. He believes this would clear up the ambiguity and unethical practices in this area.
P.K. Downes explains that electronic mail (email) remains the most widely used service on the internet due to its low cost, universality, and ease of use. He goes on to explain the advantages of email over the postal service, telephone calls, fax messages, and other forms of communication. Email works by storing incoming mail on a mail server, typically provided by the user’s Internet Service Provider (ISP). Email can be accessed by an email program, such as Outlook Express, or through an internet browser. If accessed through a browser, the ISP’s mail server is replaced by a web-based email account. Downes goes on to explain several features offered by many email providers, such as mailing lists, voice over Internet Protocol (VoIP), instant messaging, and Usenet newsgroups.
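Whichever client retrieves it, a stored email message has the same underlying shape: a block of headers followed by a body. A small sketch using Python's standard email module (the addresses and subject below are placeholders, not from Downes's article):

```python
from email import message_from_string

# A raw message as it might sit on the ISP's mail server
raw = (
    "From: alice@example.com\r\n"
    "To: bob@example.com\r\n"
    "Subject: Stored on the mail server\r\n"
    "\r\n"
    "Body text the mail client or webmail page would display.\r\n"
)

# A mail program (e.g. Outlook Express) or webmail front end parses
# the retrieved message into headers and body before displaying it
msg = message_from_string(raw)
print(msg["Subject"])              # Stored on the mail server
print(msg.get_payload().strip())
```

This shared format is what lets a desktop program and a web-based account interchangeably present the same mailbox.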
In “Why the Arpanet Was Built,” Stephen Lukasik asserts that the goal of Arpanet was to “exploit new computer technologies to meet the needs of military command and control against nuclear threats, achieve survivable control of US nuclear forces, and improve military tactical and management decision making.” Lukasik recognizes that even though the capabilities of a packet-switching network had non-defense applications, those needs were not central to ARPA’s decision to pursue networking. A pivotal point in the direction of the ARPA program came in October 1962, when Jack Ruina hired J.C.R. Licklider to run the program. Previously ARPA had been carefully watched by the Secretary of Defense, the White House, and the President’s Science Advisory Committee, but because of a shift in focus in behavioral science “beyond the narrow focus in the department beyond human factors,” Licklider was left to his own endeavors. Licklider had his own vision of the command and control problem ARPA was created to solve. He saw it as having two parts: “the machine processing of information and the presenting of that information to humans in a form suitable for use in making decisions.” It was this vision and room to develop that led to the creation of a decentralized network that was the “technical solution to avoiding decapitation,” allowing the network to survive an attack. Licklider anticipated that the general utility of networking would go far beyond the needs of the Department of Defense, but Arpanet was originally conceived for the purposes of command and control during the Cold War.