COTS Journal | Intelligent Systems Source

The editors and staff of COTS Journal are dedicated to providing the industry with the best quality technical material to help readers design and build embedded computers for the military – whether for benign applications or for the most rugged, mission-critical jobs the battlefield, sky or water can offer.

What differentiates COTS Journal from the rest of the pack – aside from its unique posture as the only technology publication addressing the military market – is the quality of its editorial staff. These seasoned veterans of the computer, defense and publishing industries not only go the extra mile to provide the most up-to-date information in an easily readable form, but also participate in the industry themselves and are widely sought after to provide technology, market and trend briefings to the industry and government sectors.

Cybersecurity Meets Physical Safety: Shoring-Up Weak Links in Critical Operations

By Deborah Lee James, Former Secretary of the Air Force

Cyber-attacks are a daily occurrence for the US Air Force. In an unfortunate parallel to private industry, Air Force networks are attacked, and defended, thousands of times each week. I know this all too well as a former Secretary of the Air Force. In my current role, I am also well aware that operators of public as well as private critical networks are forced to pour more time, attention and resources than ever before into computer network security because it is so critical to our nation’s safety and economic vitality.

In light of these costs, and the associated high stakes, a clear return on investment is imperative. Such a return is not certain if we train our sights solely on network security. Cyber intruders have shown us repeatedly that many avenues are open to them for launching attacks and compromising critical data. That’s why the Air Force is investing more heavily in operational security, and the private sector cannot afford to fail to follow suit.

Operational security encompasses the entire portfolio of assets that execute processes, or missions, as directed by software code. These processes might be setting a flight path for an advanced fighter aircraft, or they may direct automated maintenance routines for an HVAC system on a facility where classified operations are conducted.

The Industrial Internet of Things (IIoT), characterized by the vast and complex interconnections of different systems, has opened innumerable gateways to our economy’s many enemies. Today’s cybersecurity for our critical infrastructure — dams, powerplants, industrial complexes — extends barely beyond the network core. Yet surprisingly, and alarmingly, edge devices [...]

Struggling with How to Apply Virtual Reality/Augmented Reality to Your Training Needs? Three Guidelines to Make It Work and Nail Down the ROI Target

By Raj Raheja, CEO, Heartwood

At times, technology articles overstate the potential that Virtual Reality (VR) and Augmented Reality (AR) offer for training, since low-cost headsets and devices are commercially available and relatively easy to deploy.

By now, all services of the military either have deployed or are in the process of developing VR/AR-based training exercises.

The U.S. Navy hopes to save $1 billion by incorporating AR technology into their shipbuilding process (through Newport News Shipbuilding). By adding the 3D component of augmented reality to the traditional 2D approach, shipyard workers can quickly understand and perform tasks like placing studs in a bulkhead or steel panel, saving hours per person/task daily.

On the Operations and Maintenance side, virtual reality training is being deployed for Littoral Combat Ship (LCS) crews, providing critical, real-time feedback.

This is just the tip of the iceberg, although there are some lessons to be learned and caveats to keep in mind, as experience is gained.

Many teams want to start deploying (or trying out) VR/AR-based training without accounting for fundamental training needs and lifecycle considerations. This siloed approach will stall and limit project or program ROI.

Here is an effective roadmap that companies can follow from the start, when deploying visual and immersive training solutions:

1. Plan for Training + VR/AR, not VR/AR + Training

VR/AR must be additive in the training lifecycle, not siloed as a tech innovation available only to the few with the specific hardware on hand. Approximately 80% of the cost involved in creating immersive VR/AR Operations & Maintenance training content can be repurposed across many platforms – such as web, mobile and laptops. This [...]
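As a rough illustration of how that reuse figure drives ROI, the sketch below runs the arithmetic. Only the ~80% reuse fraction comes from the article; the content cost and the number of additional platforms are invented for the example.

```python
# Hypothetical back-of-the-envelope ROI sketch. Only the ~80% reuse figure comes
# from the article; all cost numbers below are invented for illustration.
initial_content_cost = 100_000    # cost to author the immersive VR/AR content once (assumed)
reusable_fraction = 0.80          # portion of that work reusable on other platforms (from article)
extra_platforms = 3               # e.g. web, mobile, laptop

# Building each additional platform from scratch vs. reusing the shared content.
cost_from_scratch = initial_content_cost * (1 + extra_platforms)
cost_with_reuse = initial_content_cost + extra_platforms * initial_content_cost * (1 - reusable_fraction)

print(f"From scratch: ${cost_from_scratch:,.0f}")
print(f"With reuse:   ${cost_with_reuse:,.0f}")
print(f"Savings:      {1 - cost_with_reuse / cost_from_scratch:.0%}")
```

Under these assumed numbers, reuse cuts the multi-platform content bill by roughly 60%, which is the kind of lifecycle-level figure a program should estimate before committing to VR/AR hardware.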


Situational Awareness

By Drew Castle, Vice President of Engineering, Chassis Plans

With the constant emergence of new display technologies in the consumer sector, it is important for program managers and engineers in the military markets to be aware of which of these advances can be incorporated into new military programs, and how, to achieve maximum benefit. Additionally, as older programs come up for technology refreshes, it is imperative to understand how display specifications differ from those of previous decades, and how updated technology provides avenues for their evolution and continued progress.

It is no secret that the amount of data available to today’s warfighter is staggering. With the number of data sources ever increasing – from new and more advanced Tactical Data Links and improved mapping and terrain data to advanced weather depiction – it becomes even more important to have new and better human interface technology so the information can be quickly digested and decisions made and acted on.

Modern user terminals have high resolution displays with many enhancements for interfacing with this data.  Major trends such as touch screens and enhanced pointing devices, gesture support, biometric authentication and voice recognition are ways to interact with data quickly and securely.

The transition from cathode ray tube displays to early thin film transistor liquid crystal displays occurred rapidly across myriad industries, and the military was no exception. Terminals with multiple display screens, like the three-screen display shown in Figure 1, are seeing transit case deployment in environments where their larger predecessors would never have been taken. The advantages of smaller overall size and decreased power consumption made the transition an obvious upgrade for a military with a continued focus on rapid tactical capabilities. Current advances in display technology have seen the [...]


Elements of a Video Management System for Situational Awareness

By Val Chrysostomou, Curtiss-Wright Defense Solutions

There has recently been a proliferation of cameras and sensors on-board ground and airborne platforms for situational awareness applications. This means there is a growing challenge of how best to provide operators with as much usable visual information as possible while ensuring that the data is readable and actionable in real time. Adding to the complexity of the problem is the fact that the operator is typically limited, because of space, weight and power requirements, to a single display screen.

If the operator has to switch between views to access information and gain good situational awareness, the result can be delays and an incomplete picture that hinder the mission. This includes alternating between different layers of information from numerous sensor feeds. The video should be presented to the user in a way that helps them meet their objectives, as poorly displayed information can cause confusion that is detrimental to the mission.

The most effective video management systems (VMS) enable a platform’s crew to control their video options – such as sensor inputs, screen configuration, underlay maps and video recording – directly from their touchscreen display. When a crew member’s display also serves as the VMS control center, complete control of surveillance video comes at the touch of a button. The principal advantages of a VMS are simpler integration and maintenance, reduced cost and higher reliability (simplified inter-unit cabling), and flexibility and scalability for platform upgrades.

A VMS is typically characterized by the following (a minimal configuration sketch follows the list):
  • Video streams from multiple sensors or computers
  • Distribution of video streams to multiple displays
  • Flexible display of video – full-screen or quad/picture-in-picture/picture-by-picture
  • Multi-channel recording capability
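To make those characteristics concrete, here is a minimal configuration sketch in Python. It is illustrative only: the class names, fields and stream identifiers are hypothetical and are not drawn from any particular Curtiss-Wright product or API.

```python
# Hypothetical sketch of a video management system (VMS) configuration,
# modeling the characteristics listed above. All names are illustrative only.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class SensorInput:
    name: str          # e.g. "front", "EO turret"
    stream_url: str    # e.g. an RTSP URL or SDI channel identifier

@dataclass
class DisplayLayout:
    mode: str                  # "full", "quad", "pip" (picture-in-picture), "pbp"
    sensor_names: List[str]    # which inputs appear in this layout

@dataclass
class VMSConfig:
    inputs: List[SensorInput] = field(default_factory=list)        # streams from multiple sensors
    displays: Dict[str, DisplayLayout] = field(default_factory=dict)  # display id -> layout
    record_channels: List[str] = field(default_factory=list)       # multi-channel recording

# Example: four sensors, two displays (one quad view, one full-screen), all inputs recorded.
cfg = VMSConfig(
    inputs=[SensorInput(n, f"rtsp://vms/{i}")
            for i, n in enumerate(["front", "rear", "left", "right"])],
    displays={
        "commander": DisplayLayout("quad", ["front", "rear", "left", "right"]),
        "driver": DisplayLayout("full", ["front"]),
    },
    record_channels=["front", "rear", "left", "right"],
)
print(len(cfg.inputs), "inputs routed to", len(cfg.displays), "displays")
```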

Depending on the platform, there may be wide variation in how many sensors are supported. [...]


A few keys to succeed with displays…

By John Aldon, PhD, President, MILCOTS

Integrating a display into a subsystem, whether it is a 24-inch large-format display for an operator console or a 15-inch rugged panel PC controlling a gun system, may seem an obvious and straightforward task. Field experience tells a different story: various factors will influence the performance of a video link, regardless of the display being used. We provide a few examples of real situations and emphasize the best ways to anticipate problems and save all parties time and frustration.

Knowing the video sources

As with all topics, a display manufacturer has to deal with various situations when engaging with a customer on a new project, and aligning expectations can be a challenge if key points are not properly reviewed upfront. Most customers will not spontaneously disclose much about the architecture of the system the display will be embedded in. Even if a display may seem a pretty simple item, some of the systems we deal with, such as a weapon control station, a multi-function operator console or a large 55” 4K damage control panel, involve many customer-controlled subassemblies that may lean on legacy, obsolete video sources. Assuming that the video feed provided by the customer always meets today’s standards is like shooting in the dark: it may work, but it can also lead to a tremendous amount of time spent afterwards when, for whatever reason, the final performance of the video link falls short of expectations. A recent representative example worth noting was the request for an HD-SDI port as an alternate video input on a 17” display, without mentioning that the video feed was delivering a 30 Hz signal. The video controller planned for this project was [...]
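One lightweight way to catch this class of mismatch early is to compare the customer's declared video timings against the mode table the planned display controller actually accepts. The sketch below is a hypothetical illustration of that pre-integration check; the mode list and controller limits are invented and do not describe any MILCOTS product.

```python
# Hypothetical pre-integration check: does the customer's video source match what
# the planned display controller can accept? All modes and limits are invented examples.
SUPPORTED_MODES = {
    # (width, height, refresh_hz, interface)
    (1920, 1080, 60, "HD-SDI"),
    (1920, 1080, 50, "HD-SDI"),
    (1280, 1024, 60, "DVI"),
}

def check_source(width, height, refresh_hz, interface):
    """Return a list of findings for one declared video source."""
    findings = []
    if (width, height, refresh_hz, interface) not in SUPPORTED_MODES:
        findings.append(
            f"{interface} {width}x{height}@{refresh_hz} Hz is not in the controller's mode table"
        )
    return findings

# The situation described above: an HD-SDI feed that turns out to be 30 Hz.
for issue in check_source(1920, 1080, 30, "HD-SDI"):
    print("REVIEW WITH CUSTOMER:", issue)
```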


How Artificial Intelligence (AI) and Deep Learning Impact the Future

An interview with Dr. Maya Dillon, Head of Data Science, Global Centers of Expertise at Luxoft, a global technology consulting company focused on business transformation solutions for Fortune 500 companies.

By John W. Koon, Editor-in-Chief

Dr. Dillon provides support for data science solutions across Luxoft’s Lines of Business. She helps clients make sense of large, disparate data sources to extract real actionable insight. As a result, she helps businesses deliver new, differentiated products, driving competitive advantage. Dillon is a member of the Tech London Advocates organization and a supporter of The Royal Astronomical Society. She received her Doctorate in Astrophysics from the University of Warwick.

1. Can you provide an overview of what Artificial Intelligence (AI) and Deep Learning are, and how they will impact the future development of technologies? Please include an example or two.

There are two particular arenas of AI that are fascinating to me:

Healthcare: Diagnosis and Treatment of Diseases:

AI is now capable of diagnosing diseases with greater accuracy than human doctors by taking into account a larger number of factors. The use of AI in subsequent treatment is also compelling, particularly in the case of cancer. AI now supports everything from the identification of tumors, to implementing therapy, to aiding the excision of masses. Such methods are vastly improving the efficacy of current treatment plans and, consequently, the quality and longevity of patients’ lives.

Automotive: Self-Driving Vehicles:

The key to a successful self-driving vehicle is ensuring the AI controlling the vehicle is constantly aware of the events occurring around it, in order to recommend or enforce a course of action.

However, [...]


The Many Flavors of Low-power Wide Area Network (LPWAN or LPWA)

The impending shutdown of 2G networks creates confusion for organizations with varied industrial connectivity needs, and this shift has created a window of opportunity for unlicensed low-power wide area networking (LPWA) solutions like LoRaWAN, SigFox and others. “The Many Flavors of LPWA” takes a closer look at these new LPWA options, addressing their varied features and benefits to help organizations decide which one best meets their unique requirements.

By Derek Wallace, MultiTech

The options for industrial connectivity are broad and growing, including analog, Ethernet, cellular, satellite, Bluetooth, Wi-Fi and the up-and-coming Low Power Wide Area (LPWA) technologies, which seek to address key limitations of the others in order to better enable the growing Internet of Things, specifically: range, cost and battery life. Cellular operators are voluntarily shutting down the earliest 2G networks, driving M2M/IoT customers not only to upgrade their physical devices but also to purchase bandwidth beyond what is generally needed for M2M and Industrial IoT applications – 75% of which use less than one megabyte of data per month, according to James Brehm & Associates. The global carrier community is looking to variants of LTE, and in future years to 5G, to address this disconnect. Unfortunately, from a practical perspective, these alternatives (LTE Category M and Narrow Band IoT [5G]) are still on the horizon in terms of immediate adoptability. This timing disconnect has created a window of opportunity for unlicensed LPWA networking solutions like LoRaWAN, SigFox and others. These solutions can run for years on batteries and operate in locations other technologies simply don’t reach. Plus, because they operate on unlicensed spectrum, they deliver device connectivity at a fraction [...]
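To make the "less than one megabyte per month" point concrete, here is a hedged back-of-the-envelope data budget for a typical LPWA sensor. The ~1 MB/month figure is cited above; the payload size and reporting interval are assumptions chosen only for illustration.

```python
# Hypothetical data-budget sketch for a typical LPWA sensor node. The ~1 MB/month
# figure is cited in the article; payload size and reporting interval are assumptions.
payload_bytes = 12          # e.g. an uplink frame carrying a few sensor readings (assumed)
reports_per_hour = 4        # one reading every 15 minutes (assumed)
hours_per_month = 24 * 30

monthly_bytes = payload_bytes * reports_per_hour * hours_per_month
print(f"{monthly_bytes} bytes/month = {monthly_bytes / 1_000_000:.3f} MB/month")
# Roughly 0.035 MB/month under these assumptions: well below the ~1 MB/month most
# M2M/IoT devices stay under, and far below a typical cellular data plan.
```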


COTS in Space – a Guide for Managers


By George Romaniuk, Director, Space Product Management, Aitech Defense Systems

The motivation to write this article came from recent discussions with traditional military and aerospace personnel and, to some extent, with commercial companies interested in flying commercially available hardware in space. Its purpose is to shed some light on the reasoning behind the selection of hardware for space applications, with an emphasis on COTS. Although the basic question these companies asked centered on whether a military-grade single board computer (SBC) could be used for space missions, it was asked in different ways.

What would it take to make my SBC space qualified?

Let’s start with some clarification of commonly used phrases like “Space Grade” or “Space Qualified” hardware. It’s unfortunate that these phrases are oftentimes used universally, without thinking about all the issues and considerations associated with the application of hardware for a specific space mission (Figure 1).

Why are we talking about “Space Grade” parts to begin with? Sending hardware to space was—and is still—very expensive, so the objective is to maximize the mission’s success and lifetime. In order to achieve this, we need to look at the reliability of the parts we would like to use for the mission. (Note, we do not address the effects of radiation here, as it’s addressed later in the article.)

The traditional approach to obtaining Electrical, Electronic and Electromechanical (EEE) parts with the desired reliability was based on parts that were well-designed both electrically and mechanically and manufactured using the same process and materials in quality-controlled production lots. These parts were later subject to a screening process intended to identify and remove the few parts that exhibited infant mortality failures.
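The value of that screening step can be illustrated with a simple reliability model. The sketch below assumes a Weibull failure distribution with a shape parameter below 1 (the decreasing-hazard "infant mortality" region of the bathtub curve) and shows how a burn-in period weeds out early failures and raises the surviving parts' mission reliability. All parameter values are assumptions chosen purely for illustration, not data from any screening standard.

```python
# Illustrative (not authoritative) model of burn-in screening for infant mortality.
# Assumes Weibull-distributed failures with shape < 1; all parameters are invented.
import math

def weibull_survival(t_hours, shape=0.5, scale=200_000.0):
    """Probability a part is still working at time t (Weibull reliability function)."""
    return math.exp(-((t_hours / scale) ** shape))

mission_hours = 5 * 365 * 24      # a 5-year mission (assumed)
burn_in_hours = 240               # screening / burn-in period before delivery (assumed)

# Without screening: probability of surviving the whole mission from t = 0.
p_no_screen = weibull_survival(mission_hours)

# With screening: condition on having survived burn-in (infant failures removed),
# then survive the mission that follows.
p_screened = weibull_survival(burn_in_hours + mission_hours) / weibull_survival(burn_in_hours)

print(f"Fraction of parts weeded out during burn-in: {1 - weibull_survival(burn_in_hours):.3f}")
print(f"Mission survival, unscreened part: {p_no_screen:.3f}")
print(f"Mission survival, screened part:   {p_screened:.3f}")
```

Because the assumed hazard rate decreases with time, a part that has already survived burn-in has a better chance of completing the mission than a fresh, unscreened one; that conditional improvement is the statistical rationale for screening.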

An important aspect of the screening process is the [...]


Space Marvels and the NASA Connection


Beyond Eclipse

2017 was a very special year: we were able to witness a total eclipse of the sun (Figure 1). The next total solar eclipse in the USA will be on April 8, 2024. Those fortunate enough to view this spectacular phenomenon will tell you it is an experience of a lifetime. But have you looked at the sun from space? Rob Gartner of NASA provided us with this description:

“A ground-based image of the total solar eclipse on Aug. 21, 2017 (gray, middle ring), is superimposed over an image of the Sun’s atmosphere, called the corona (red, outermost ring), as seen by ESA (the European Space Agency) and NASA’s Solar and Heliospheric Observatory (SOHO), which watches the Sun from space. At center is an image of the sun’s surface as seen by NASA’s Solar Dynamics Observatory in extreme ultraviolet wavelengths of light.

During a total solar eclipse, ground-based telescopes can observe the lowest part of the solar corona in a way that can’t be done at any other time, as the dim corona is normally obscured by the bright light of the Sun. The structure in the ground-based corona image — defined by giant magnetic fields sweeping out from the Sun’s surface — can clearly be seen extending into the outer image from the space-based telescope. The more scientists understand about the lower corona, the more they can understand what causes the constant outward stream of material called the solar wind, as well as occasional giant eruptions called coronal mass ejections.” https://www.nasa.gov/image-feature/goddard/2017/aug-21-solar-eclipse-fromground-and-space. Figure 2 is a view of the sun from [...]


The Future is Now – Artificial Intelligence and Augmented Reality are Emerging to Dominate New Systems and Capabilities


As Yogi Berra once observed, “making predictions is difficult, particularly if it involves the future.” But in this case, the future is now.

Consider the following scenario:
The new and enhanced attack vehicle has just survived an intense firefight. With the enemy in retreat, the commander asks the system itself for a damage report. In a Siri-like voice, it responds that there was limited damage and that the vehicle can return to base, but at a lower speed. On the main display the commander can see a 360-degree image with a virtual display of enemy resources (from drone communications) superimposed. Central Command also receives the images and directs the drone, with its missiles, to oversee the egress. Base maintenance crews have been alerted to the damage and have the replacement parts waiting. On return, the vehicle system will guide maintenance to the location of the damage and illustrate how to reach it. The commander appears as a virtual image within actual Central Command and is debriefed while he is still in the field.

The implications for space applications are staggering. Interactions between home-based and space-based activities – particularly for unmanned missions, where the likelihood of experiencing unplanned events is high – are best developed using AI (wherein the unmanned system learns from home-based interactions) and AR interactions, wherein home-based participants can experience the actual space environment.

What I mean by AI learning might be explained by the following example. Have two different Google users search the same question on their respective computers. What the search turns up will be different for each, based on their past inquiries.

Does this [...]
