IEEE Internet Policy Newsletter - July 2017

ISSUE 6 | JULY 2017

Health Informatics Standardization and the e-Health Sector: Part 2

By Tahir Hameed, Ph.D.
SolBridge International School of Business, Daejeon, South Korea

The first article in this series, released in our previous issue (May 2017), suggested that delays in health informatics standardization may have slowed the development and adoption of e-health systems, and hence contributed to inefficiencies in the healthcare sector. This part takes the analysis further by looking more deeply into Healthcare Informatics Standards (HIS) in the U.S. over the past decades.

An evolutionary view of technology standardization, mapping Tassey's (2000) categories of technology standards across the product/technology cycle, suggests that the timing of development, promotion, and adoption of different categories of standards is driven predominantly by evolving technology and market needs. In the early phase of a technology industry, Information Standards (IS) (ontologies, test, measurement, and evaluation standards) lay the foundation for R&D and product development. Inter-Operability Standards (IOS) then establish a level playing field for fair competition among the product-systems under development. Once a dominant design, the architecture of the winning product-system, is accepted by the market, industry shifts its focus to economies of scale in production through much-needed Variety-Reducing Standards (VRS), which promote both ontological and semantic inter-operability. Finally, Quality and Reliability Standards (QRS) provide the basis for operational efficiencies, customer service levels, and market expansion through improvements in the relevance and performance of product-systems and services. This order of standards development is not indispensable, but it aligns with the evolving requirements of an industry.
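The sequencing above can be summarized schematically. The following Python sketch is a hypothetical encoding of that mapping (the phase labels and the data structure are this sketch's own, not Tassey's):

```python
# Hypothetical encoding of Tassey's (2000) standards categories mapped to
# phases of the technology/product cycle, as described above. The phase
# labels and dict structure are illustrative, not taken from the paper.
STANDARDS_BY_PHASE = {
    "early R&D / product development": (
        "IS",   # Information Standards: ontologies, test, measurement, evaluation
    ),
    "competing product-systems": (
        "IOS",  # Inter-Operability Standards: level playing field for competition
    ),
    "dominant design / economies of scale": (
        "VRS",  # Variety-Reducing Standards: ontological and semantic inter-operability
    ),
    "mature market / operations": (
        "QRS",  # Quality and Reliability Standards: efficiency, service, expansion
    ),
}

for phase, categories in STANDARDS_BY_PHASE.items():
    print(f"{phase}: {', '.join(categories)}")
```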


The Many Pieces of the Multistakeholder Puzzle

By Daniel Fink, Ph.D., ICANN, São Paulo, Brazil, and Mark W. Datysgeld, Researcher at São Paulo State University (UNESP)

Following the end of the Cold War, a series of discussions were carried out, particularly within the United Nations, that pointed towards a more cooperative future[1]. Global politics would no longer be determined solely by states and a few powerful companies. The notion of global governance emerged and blossomed between 1992 and 1995, as more and more attention was given to the subject.

However, some experimental intergovernmental initiatives involving non-state actors had already existed for decades. One example is the International Telecommunications Satellite Organization (INTELSAT), founded in 1964 as an amalgamation of state and private companies. Its primary purpose was to coordinate satellite operations, which paved the way for the communications revolution of the twentieth century[2].


Persistent Protection of Data

By Jay Wack, President
Tecsec, Inc. USA

The use of encryption has expanded into broader use cases over the past years. With the advent of the internet, in addition to concerns about interception (otherwise referred to as Data in Transmission), a new security paradigm has surfaced that is referred to as Data in Storage (or at rest). Beyond protecting the information itself, encryption has been coupled with access control designs, so that encryption enforces who has access to specific information, locations, and functions, and often even who can access the transport layer.

There are other layered variations for encrypting data, but network and content encryption offer the clearest examples, and the two methodologies differ substantially. A network solution produces a secure channel that information can pass through, which can be viewed as an encrypted pipe; a content solution binds the encryption to the information itself, so that each piece of content is encrypted separately. Content encryption can therefore be thought of as persistent protection: the encryption stays bound to the content, or to a message, throughout the life cycle of that message.
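As a rough illustration of the content model, the sketch below (in Python, assuming the third-party cryptography package; the key and message names are invented for this example, not drawn from TecSec's products) encrypts each piece of content separately, so the protection travels with the data rather than with the channel:

```python
# Minimal sketch of content (persistent) encryption, as opposed to a
# channel solution such as TLS. Assumes the third-party "cryptography"
# package (pip install cryptography); names and data are illustrative.
from cryptography.fernet import Fernet

# Possession of the content key is what grants access, so encryption
# doubles as access control: no key, no read, regardless of where the
# ciphertext is stored or how it travels.
content_key = Fernet.generate_key()
vault = Fernet(content_key)

# The content is encrypted once, at creation...
token = vault.encrypt(b"quarterly forecast, internal distribution only")

# ...and remains encrypted at rest, in transit, and in any intermediate
# store. An encrypted pipe would protect it only while on the wire.
stored_ciphertext = token

# Only a holder of the matching key can recover the content.
plaintext = vault.decrypt(stored_ciphertext)
assert plaintext.startswith(b"quarterly forecast")
```

The design point is that the key, rather than the channel, determines who can read the data; that binding is what makes the protection persistent across the message's life cycle.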


The Role of Forensics in the Internet of Things: Motivations and Requirements

By Suleman Khan, Ph.D.
Lecturer at School of Information Technology, Monash University Malaysia

The Internet of Things (IoT) is an emerging technology that allows people to interact with billions of devices around the world[1], providing unsurpassed convenience in daily life. However, the open nature of interaction between IoT devices gives intruders opportunities to exploit the data transferred among them[2]. Securing each and every device in the IoT paradigm, along with the integrity of its data, is an utmost challenge for the IoT community[3]. Currently, the community is beginning to think about IoT security in terms of embedded security solutions, middleware, cloud security, and much more. All of these efforts, however, count towards the detection and prevention of security attacks; they do little to investigate the source of an attack once it has caused problems for an IoT deployment. The IoT paradigm therefore requires forensic solutions to find the root cause of attacks and to minimize them[4].
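One concrete building block of such forensic solutions is a tamper-evident event log kept on or near the device, so that investigators can later reconstruct what happened and trust that the record was not altered after the fact. The sketch below is a minimal, hypothetical illustration in Python (the class, field names, and events are invented for this example, not taken from the article):

```python
# Minimal sketch of a tamper-evident, hash-chained event log that an
# IoT gateway might keep as forensic evidence. Field names and events
# are illustrative assumptions, not taken from the article.
import hashlib
import json
import time

class ForensicLog:
    def __init__(self):
        self.entries = []
        self.prev_hash = "0" * 64  # genesis value

    def record(self, device_id, event):
        entry = {
            "ts": time.time(),
            "device": device_id,
            "event": event,
            "prev": self.prev_hash,
        }
        # Each entry commits to the previous one, so editing or deleting
        # an earlier record breaks every later hash.
        digest = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        entry["hash"] = digest
        self.prev_hash = digest
        self.entries.append(entry)

    def verify(self):
        """Recompute the chain; return the index of the first bad entry, or -1."""
        prev = "0" * 64
        for i, entry in enumerate(self.entries):
            body = {k: entry[k] for k in ("ts", "device", "event", "prev")}
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if entry["prev"] != prev or entry["hash"] != digest:
                return i
            prev = entry["hash"]
        return -1

log = ForensicLog()
log.record("sensor-42", "firmware update requested")
log.record("sensor-42", "outbound connection to unknown host")
assert log.verify() == -1  # chain intact; tampering would surface here
```

Because every entry commits to the hash of its predecessor, an intruder who deletes or edits an earlier event invalidates every later hash, which is exactly the property an investigator needs when establishing a chain of evidence.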

There is no single solution that can protect the entire IoT infrastructure from every security attack. An attacker always has an opportunity to bypass security barriers, owing to the rapid pace of technological change, the open-source market, the sheer number of applications, and various other factors.