
Know Now: The Need for Improved Field Inventory Visibility of Medical Devices


As of 2018, the medical device market had grown into a $423.8 billion industry and is expected to reach $521.64 billion by 2022. With over 6,500 medical device companies and a growing volume of inventory moving through the field daily, real-time visibility of inventory is a necessity. Mounting regulatory standards, demand, costs, and inefficiencies are all driving the industry toward a much-needed change in inventory management, specifically field inventory.

While many solutions exist for managing critical supplies, a fully connected, smart inventory management system presents the most proactive way to automate inventory management. Connecting assets using IoT technology gives medical device manufacturers and hospitals alike real-time information throughout the supply chain, helping them better meet customer demand, improve patient safety, comply with regulatory standards, and accurately monitor temperatures and expiration dates. All of this results in better outcomes for suppliers and providers.

The Current Inventory Landscape Is Not Sustainable

The term “medical devices” covers a wide variety of items, from a simple suture or a wheelchair to pacemakers and vascular grafts. All of these are essential to providing uncompromised patient care at healthcare organizations across the U.S. Be it a small rural clinic or a nine-hundred-bed hospital, medical devices are a major part of the healthcare industry. The U.S. Department of Commerce estimates that the U.S. medical device market has grown to $423.8 billion. Currently, over 6,500 medical device companies supply innovative products and services to more than 6,000 healthcare provider locations in the U.S. As this significant market has grown and evolved over the past 50 years, having the right instrument or implant at the right place at the right time has become increasingly critical. Unlike the retail sector, where a stock-out usually means a lost or delayed sale, a stock-out in the health industry has far worse consequences and must be avoided at all costs: it puts patient safety at risk and seriously jeopardizes the relationship between the healthcare provider and the supplier.

To prevent this, medical device manufacturers have developed numerous, often complex, supply chain and distribution models. These traditional models often rely on a sales representative or inventory specialist delivering the product to the hospital on the final mile of its journey. The intricate network required to get a product to the right place at the right time results in a continuous buildup of inventory in the field.

Field inventory refers to inventory that has left the point of manufacture or main distribution center but has not yet reached the point of use. Too often, field inventory ends up in forward stocking locations (i.e., trunks, homes, on consignment at hospitals, or in transit between any of those locations). In some cases field inventory cycles quickly, arriving at its final destination via a sales representative for a specific procedure and returning to a central distribution center immediately after. In other cases an implant can sit as consigned inventory at a hospital for years.

To date, the challenge for field inventory management has been visibility. In the majority of instances, once product leaves the manufacturing facility or main distribution center its location and status can only be determined by manual efforts.

Traditionally this involves customer service calls to a sales representative inquiring about a particular piece of inventory. Rather than view inventory in real-time as it moves through the supply chain, medical device companies scramble to avoid stock-outs and reconcile actual demand from current inventory, while also ensuring they are in compliance with increasing regulations.

An independent healthcare industry consultant recently studied the inventory patterns of 49 publicly traded manufacturers of medical devices and instrument products. The median number of days in inventory was 137, meaning that a great deal of high-value inventory floats in the field for roughly four and a half months on average. At a time of increasing pressure on margins and efficiency, this glut of field inventory underscores the problem. The highest figure was 400 days, meaning that manufacturer has more than a year of inventory in the field, a likely red flag for financial auditors and investors alike. It also increases the threat of expired product circulating in the field with the potential to be used in a procedure.

If a company with $1 billion in sales cannot account for nearly half of its field inventory, auditors will want proof that the inventory is real and will likely require monthly sales audits, resulting in additional time and expenses.
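The days-in-inventory metric behind these figures is a standard financial calculation: average inventory value divided by annual cost of goods sold, scaled to days. A rough sketch, with hypothetical numbers not drawn from the study:

```python
def days_in_inventory(avg_inventory: float, annual_cogs: float) -> float:
    """Days inventory outstanding: average days stock sits before use or sale."""
    return avg_inventory / annual_cogs * 365

# Hypothetical manufacturer: $150M average inventory against $400M annual COGS
dio = days_in_inventory(150e6, 400e6)
print(round(dio))  # lands near the 137-day median the study reports
```

Auditors flagging a figure like 400 days would be looking at the same ratio computed from reported financials.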

The old adage “if it’s not broken, don’t fix it” has applied to the medical device industry for years. However, the research and current landscape show that the way inventory is managed is, in fact, broken. Approaching the problem proactively is the best response in a market that has continued to operate in a manual, reactive manner. The growing levels of field inventory have created serious challenges for stakeholders across the healthcare community.

Source: Terso Solutions

£4.9m hub designed by Lancaster University set to promote innovation across Lancashire

Chemical research at Lancaster University.


A new business-support hub aims to spearhead innovation for Lancashire businesses that produce or use chemicals.

Co-designed by Lancaster University and Yordas Group, a provider of scientific and regulatory consultancy services, the £4.9m NextGenChem programme will work with 300 small and medium-sized enterprises (SMEs) to boost growth in chemical-using industries such as aerospace, automotive, energy, applied healthcare, and life sciences.

The three-year programme, part-funded by the European Regional Development Fund, will lead to the creation of new and improved products and processes. Support will be delivered through a blend of targeted workshops focused on innovative synthesis, formulation, and chemical process development, alongside bespoke technical and analytical research and development support.

“Together, Lancaster University and Yordas Group present a complementary range of expertise in various scientific areas with a unique industrial insight,” said Dr Safyan Khan, NextGenChem Project Manager at Lancaster University.

Source: The Visitor

Pulse Check: Assessing the Global Wearable Medical Devices Market

The global wearable medical devices market wasn’t immune from COVID-19 supply-chain disruptions, clinical trial delays, and sales losses. However, even a devastating pandemic didn’t stop the wearable medical device market’s steep ascent in revenue.


As demand for telehealth and remote patient monitoring skyrocketed and regulatory and reimbursement barriers eased, investors gained confidence in these technologies. Digital health overall set funding records during Q2 2020, a period in which U.S. GDP fell at an annualized rate of 32.9%.

According to a report from research firm Mercom Capital Group, digital health received $6.3 billion in funding in the first half of 2020, up 24% from 2019. Of that amount, $794 million went to mHealth apps and $321 million went to wearable sensors.

For the 12-month period ending June 2019, M&A deals in the Asia-Pacific region rose in volume by 239% to 61 deals according to an EY report. That’s despite total deal value slipping 17% to just under $4 billion.

The success story of the wearable medical devices market is part of a larger tale: the remarkable growth of the wearable technology market overall. Factoring in consumer-grade activity trackers, wireless headphones, and “smart” clothing, among other devices, the wearables market more than doubled in five years, from about $26 billion in 2014 to $67 billion in 2019, according to technology research and consulting firm IDTechEx.
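Growth figures like these can be restated as a compound annual growth rate (CAGR). A minimal sketch using the IDTechEx endpoints quoted above (the helper function is illustrative, not from the report):

```python
def cagr(start_value: float, end_value: float, years: float) -> float:
    """Compound annual growth rate implied by two endpoint values."""
    return (end_value / start_value) ** (1 / years) - 1

# Wearables market: ~$26B in 2014 to ~$67B in 2019
rate = cagr(26e9, 67e9, 2019 - 2014)
print(f"{rate:.1%}")  # roughly 21% per year
```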

James Hayward, principal technology analyst at IDTechEx, said much of that growth stems from the release of the Apple Watch in 2015 and the AirPods in 2016. “The 2018 to 2019 spike in growth was mainly correlated with true wireless stereo headphones, and particularly, AirPods, which were key in driving growth in the hearables sector,” he said. “Similarly, we saw strong growth from various wearable medical device fields.”

Wearable medical devices used to treat diabetes, heart conditions, and hearing issues experienced the strongest growth of all wearable medical device sectors. Population health initiatives and, unfortunately, COVID-19 have bolstered sales of devices used in remote patient monitoring.

The pandemic also gave telemedicine, remote patient monitoring tools, and digital health overall an incredible boost in usage—one that likely won’t go away. CMS’s loosening of restrictions, as well as FDA’s interest in expanding digital health availability, will add further fuel to wearable medical device growth.

“Regulatory hurdles are a big concern for early stage investors in the length of time before they’ll be able to get a return on their money,” said Glenn Snyder, medical technology segment leader for Deloitte. “It would be an encouragement to established companies as well.”

The shift in healthcare from episodic, fee-for-service care to preventative, value-based care benefits wearables because of their wellness-focused nature. A related shift in power from provider to patient creates additional opportunity for wearable medical device makers. 

Image: wearable medical devices (compiled by Amanda Pedersen)


For a wearable device to truly succeed, trust needs to be established and developed among all stakeholders. “The patients need to trust their information is secure, and the physicians need to trust the information is clinically accurate and effective,” said EY Global Medtech Leader James Welch. “The regulatory agencies need to trust that both these things are happening, and the healthcare systems and payers need to trust that the data can be analyzed in a way that leads to better outcomes to potentially lower the cost of care.”

Trust seems to be building with wearable medical device users, at least. According to a 2018 report from Accenture, most consumers surveyed were willing to share data with healthcare providers (88%) and health insurers (72%). Large percentages agreed that wearable technology helps them understand their health condition (75%), stay engaged with their health (73%), and monitor the health of a loved one (73%). More than two-thirds (69%) believed wearables help improve overall quality of care and patient-physician communication.

The arrival of 5G, the emergence of virtual and extended reality, and increased competition (or cooperation) from consumer tech companies will also significantly impact the wearable medical device market. Add it all up, and you’ve got one exciting time for wearable medical devices.

The Hybrids


Smart Watches Get Smarter

While wearables have been part of the medical device industry for decades (think hearing aids, Holter monitors), they’ve become gradually lighter, more comfortable, and “smarter.” As larger consumer technology companies started dabbling in healthcare technology, the line between consumer and medical wearable blurred. 

For example, the Apple Watch, which initially resembled a wrist-based, circa 2015 iPhone, now includes FDA-regulated electrocardiogram (EKG) and irregular rhythm notification features. FitBit added a blood oxygen monitoring feature to some of its smartwatches and sought FDA clearance for sleep apnea diagnosis. It also partnered with Bristol-Myers Squibb and Pfizer last year to help detect and diagnose atrial fibrillation (FDA clearance pending).

Garmin, a leader in fitness watches, plans to work with ActiGraph to develop wearables for clinical trials. It also plans to integrate its data with Medtronic’s remote patient monitoring mobile app. 

Adding regulated functions to a fitness/wellness product allows for interesting product marketing claims and higher sales prices; however, can healthcare providers actually use this information to improve patient care? 

“It’s all about the data,” said Welch. “The data need to be accurate and reliable, transmitted safely to a clinical environment, and accessible so it can be analyzed to make clinical decisions.”

Wearable Sleep Monitoring Devices

Wearable sleep devices, a category made up primarily of consumer-grade sleep trackers, are also on a growth track. Increasing awareness of sleep disorders and of the health benefits of sound sleep has fueled their growth.

The sleep tracking market is projected to reach $7 billion by 2026, according to Research and Markets. Withings, FitBit, Polar, Garmin, and Apple all offer sleep tracking features on their devices. The Oura Ring tracks similar data, but you don’t have to wear a watch to bed.

EverSleep measures blood oxygen and other data while you sleep and analyzes that data to detect sleep disturbances. It is not FDA cleared. Withings and FitBit are pursuing FDA clearance and/or CE marks for sleep apnea detection. Itamar Medical, however, received FDA 510(k) clearance in June 2019 for a disposable wearable used to test for obstructive sleep apnea.

The Leaders in Wearable Medical Devices


Continuous Glucose Monitors  

Outside of watches, continuous glucose monitors (CGMs) were the top performer in 2019. IDTechEx reports the CGM market grew by $1.3 billion from 2018 to 2019. 

Allied Market Research estimates the CGM systems market at about $1.8 billion in 2019 and projects it to reach about $8.8 billion by 2027. Use of CGMs has steadily increased: The number of people with diabetes using CGMs increased from 3% in 2006 to 38% in 2017. 

As accuracy, ease of use, and comfort improve and costs come down, CGM use will likely keep climbing. Unfortunately, the number of people diagnosed with insulin-requiring diabetes also keeps rising, from 415 million globally in 2015 to a projected 642 million in 2040, according to the International Diabetes Federation, which will also drive further adoption.

Abbott, Dexcom, and Medtronic dominate the market, while India, China, and Australia will account for a greater share of market growth over the next seven years.

Cardiovascular Devices

Traditional and extended-wear Holter monitors remain strong sellers; however, the market is diversifying with the rise of mobile cardiac telemetry devices used in remote patient monitoring. The wearable cardiac device market is expected to grow to $6.2 billion by 2026, a CAGR of 22%, according to Global Market Insights.

Disruptors such as KardiaMobile, from AliveCor, will give traditional cardiac wearables competition. KardiaMobile is an FDA-cleared personal EKG monitor that can detect atrial fibrillation, bradycardia, and tachycardia. The device attaches to smartphones and tablets: users place their fingers on the pads, and the device delivers a reading in about 30 seconds.

Hayward says this is an important time for wearable cardiovascular devices in the U.S. Changes in the reimbursement structure for some key wearable cardiovascular monitoring devices will see them reclassified under a Category I (permanent) CPT code, with changes taking effect in 2021.

“That gives developers the security that they can build a business model around these devices,” said Hayward. “When that code moves, it will shift the landscape. That’s when I think some larger players will be more confident to expand their efforts in this area.”


Hearing Aids

Hearing aids, one of the original wearable medical devices, continue to experience steady growth. Although their core purpose hasn’t changed, the devices themselves have improved dramatically over the past 10 years. Today, we have rechargeable hearing aids (no more tiny batteries) and hearing aids with smartphone connectivity. Users can connect to an app to adjust the wearable device to their surroundings or to talk on the phone.

Like cardiac devices, hearing aids will also undergo a significant shift over the next year. FDA was charged with issuing a proposed rule, followed by a final rule six months later, to create an over-the-counter hearing aid category for adults with mild-to-moderate hearing loss. Expect some new entrants into the hearing aid market over the next few years.

The Newcomers in Wearable Medical Devices


Digital Therapeutics

Pharmaceutical companies are well aware of digital disruption. Many companies now have digital health components, either through combination products, companion apps, or some other unique offering. 

A digital therapeutic delivers therapeutic interventions to patients through “evidence-based, clinically evaluated” software. Because the segment is relatively new, accurate market growth estimates are not available.

Wearable technology comes into play for drug administration. To make administration easier for the patient or to administer therapies that aren’t compatible with self-injection or auto-injectors, drug manufacturers can turn to a wearable device. 

Wearables are especially useful for biologics, which typically require high doses and/or subcutaneous administration. Digitally enabled wearables can collect and store data to be uploaded to an EHR or to monitor and track disease symptoms.

Medication Adherence Wearables

Medication adherence is a complex global problem that leads to poorer health outcomes and additional treatment costs. Avoidable hospitalizations and doctor visits attributable to non-adherence cost the U.S. healthcare system an estimated $100 billion to $290 billion annually.

A study conducted by San Mateo, CA-based Evidation Health evaluated 8,500 people with chronic disease. Researchers found people using activity trackers were more compliant with medication adherence than those who didn’t use the devices. 

Other useful devices track the medication itself. Amiko’s Respiro, which has a CE Mark for use with three specific inhalers, combines a sensor that attaches to those inhalers with a mobile app to track medication use. The device helps with medication adherence as well as tracks disease progression. Smart pill bottles and necklaces are also under discussion.

Barriers and Needle-Movers


The COVID-19 Effect

If it weren’t for the SARS-CoV-2 pandemic, the wearable medical device market outlook for the remainder of 2020 and for 2021 would look much different. “We’ll look back and see that COVID-19 was a huge accelerator for all these markets,” said EY’s Welch. “The rapid acceptance of not only remote patient interaction and remote patient monitoring but all aspects of the market is going to create exponential growth.”

Most of the barriers to telehealth reimbursement that fell in March 2020 will stay down, according to CMS. With more people choosing to receive care remotely, there’s increased opportunity for wearables that monitor vitals and other symptoms remotely. That opportunity lies primarily with regulated wearable medical devices.

“These devices are there to keep healthcare workers safe,” said Welch. “These devices are improving, and COVID-19 has helped device manufacturers learn how to better connect with patients and share data more effectively.” The remote patient monitoring device market overall (wired and wireless) is expected to hit a CAGR of nearly 15% between 2020 and 2027, according to Research and Markets.

In addition to remote patient monitoring wearables from major medical device manufacturers, new entrants have emerged with products related to COVID-19. iWEECARE’s TempPal and Blue Spark Technologies’ TempTraq are two wearable thermometers that measure temperature continuously. Empatica and the Biomedical Advanced Research and Development Authority are moving Aura, a wearable wristband that detects likely SARS-CoV-2 infection, through the FDA pipeline.

The Impact of 5G

5G has the potential to transform healthcare. It promises to remove latency, allowing for remote robotic surgery, faster image transmission, improved quality and speed of virtual visits, and faster data transmission for remote patient monitoring tools.

With 5G comes the opportunity for new levels of interconnectivity and real-time control, and wearables play a starring role. As issues around IT infrastructure and security are resolved (more data means more risk), 5G could promote another spike in the wearable medical devices market.

Threats and Challenges in the Wearable Medical Devices Market

For all the current excitement and future potential for medical wearable devices, barriers to more widespread adoption remain. Interoperability—an issue that has plagued the healthcare sector for years—remains a challenge for wearables and the data they produce. Questions to address: Who owns the data? Do the healthcare systems have the interfaces to gather, store, and share the data? Can healthcare providers access the data in a way that’s useful?

According to Deloitte’s report, Medtech and the Internet of Medical Things: How connected medical devices are transforming health care: “Interoperability is arguably the biggest challenge to health care’s ambition for a patient-centred, digitally-enabled, health care ecosystem. If the challenge is to be addressed, open platforms, based on open data standards is the direction of travel that needs to be followed to enable payers, providers, and technology vendors to finally come together to make data more available to one another.”


With advances in materials science and embedded systems, sensors have become smaller, more flexible, and more accurate. However, issues around data accuracy, especially in consumer-grade devices, hamper more widespread clinical use. A physician would likely have no need for data from a consumer sleep-tracking app but may authorize use of a device designed to diagnose sleep apnea.

According to a review article from biomedical engineers out of Portugal, questions also remain around long-term stability, resiliency, and compatibility. Textile-based sensors may degrade after repeated washings, for example.


Acceptance of wearable medical devices by both patients and physicians remains an obstacle, although COVID-19 may have lowered some of this resistance. Snyder said Deloitte found higher adoption in younger and older populations: the health-conscious Gen Z and millennials and motivated adults over 65 with chronic health conditions. 

“Both these groups tend to be proactive and use health-oriented technologies to either stay well or to manage their conditions,” he said. “One of the challenges is tapping into some of the other segments, and doing it in a way that motivates them. This is where addressing reimbursement issues becomes critical.”

Regulatory Hurdles

Traditional medical device manufacturers are familiar with the arduous process of taking a device from design to regulatory approval. Software doesn’t fit this model. FDA has advanced its digital health initiatives to better accommodate agile product design, but it remains, to date, an expensive, time-consuming process. 

For some software developers, it may not be worth the hassle. “It’s still quite a few years—or indefinitely—before we can expect truly formal medical launches from major consumer brands,” said Hayward. “Choosing whether or not to seek formal medical device approvals is a complicated strategic decision. They can quite comfortably invest the money, but it’s not part of their immediate product strategy.”

Looking Ahead

With technology’s tendency to disrupt entire sectors (retail, finance, transportation), some barriers could resolve over time or get steamrolled tomorrow. Regardless, the road ahead looks prosperous for the developing wearable medical devices market segment of the medical device industry.

“Regulatory changes coming in late 2020 should lead to positive steps for the market,” said Hayward. “It means disruption, and sometimes disruption means commoditization. But generally, it means the options will open up to many more people.”

“In the next three to five years we’re going to be able to not only capture clinical baseline indicators remotely, but potentially track them remotely,” said Welch. “And then we’ll be able to use that data in the course of either establishing diagnosis or to guide therapy for an existing diagnosis. There’s still a tremendous amount of growth opportunity in this area, and positive things that can come from those opportunities.”

Source: MD+DI

Sustainable cleaning solutions for medical electronics reliability



The role of electronic medical devices is rapidly changing as more advanced technology is designed and developed. This shift is transforming healthcare, particularly in areas like diagnostics and remote patient monitoring, as medical management moves outside the hospital setting and into the home. The trend is likely to continue as ever more patients are remotely diagnosed, treated and monitored due to the COVID-19 pandemic.

More investments are being made in innovative, electronic technology like handheld ultrasound systems and digital stethoscopes as well as pioneering remote patient tracking equipment. An increase in these technological advancements is driving the growth of the medical electronics market. It was valued at $3.01 billion in 2015 and is expected to reach $4.41 billion by 2022.

This accelerating growth and scientific advancement call for sustainable, reliable manufacturing, with assurance that devices will work effectively and consistently. A recognised method of helping to guarantee reliability is quality cleaning. Improved cleaning directly translates to more effective printed circuit boards (PCBs), and therefore to better medical electronics.

What is important is finding a cleaning solution that goes the distance in terms of sustainability, reliability and cost-effectiveness. Almost all medical devices require cleaning during manufacture to remove particulate, flux, oils or inorganic contamination resulting from the manufacturing process. The smallest contaminant can form a barrier between electrical contacts. Dirty printed circuit board assemblies (PCBAs) are susceptible to a whole host of problems, ranging from electrochemical migration and delamination to parasitic leakage, dendrite growth and shorting. This is why cleaning is crucial to ensuring the consistent performance of an electronic medical device.

The challenge is to identify a process that is suitable for the critical cleaning of complex assemblies, intricate shapes and delicate parts, while also meeting evolving environmental and worker safety regulations.

Regulation rules

Electronic components used within medical devices must not only deliver long-term functionality but also stand up to rigorous regulations put in place by governing bodies.

Medical electronics often include dense circuitry within very small, complex packages. These devices have to pass extremely stringent tests and be free from contamination to be used inside the body. Prime examples are implantable devices like pacemakers, defibrillators and cochlear implants. Reliability is non-negotiable: if such a device were to fail, it could be a matter of life or death. Reliability is essential not only for implantable electronics but also for external devices where accuracy is critical to diagnosis.

There are several different regulations to adhere to, for example those stipulated by the Food and Drug Administration (FDA), which address points like toxicity and sterility, or by the International Organization for Standardization (ISO), which defines quality management processes.

Another example is the benchmark standard IEC 60601-1. Specified by the International Electrotechnical Commission (IEC), this standard is explicitly designed for medical electrical equipment and systems. It requires that the basic safety and essential performance of the medical device be maintained. Cleaning is one of the central practices that help to meet this requirement.

Not only is cleaning required to meet regulatory and reliability requirements, but the fluids used for cleaning also have guidelines to follow. Some of the biggest challenges when cleaning medical electronic devices include environmental and worker safety concerns. There are very specific sustainability regulations that must be met. An obvious example is the Montreal Protocol, an international agreement that regulates the production and consumption of man-made chemicals which contain ozone depleting substances (ODS).

This protocol came into effect in 1987, listing nearly 100 ODS that threaten human health by depleting the earth’s protective stratospheric ozone layer. Some ODS are strictly controlled, others are targeted for phasedown, and some have been completely phased out.

The Montreal Protocol’s Scientific Assessment Panel estimates that with implementation of the treaty we can expect near complete recovery of the ozone layer by the middle of the 21st century. This highlights its significance and why it is important to find and use sustainable cleaning methods.

Sustainable cleaning

The processes and cleaning fluids used in the development and manufacture of medical electronics must be precise and consistent. However, it is not enough for companies simply to produce high-quality parts. They must also minimise negative environmental impact, conserve energy and protect natural resources, all while ensuring the health and well-being of their employees.

Vapour degreasing, when used with advanced cleaning fluids, is playing an important role in meeting this sustainability challenge. It is being more widely utilised within the medical electronics sector because it complies with the rising number of environmental laws regulating cleaning fluid use and worker safety.

The Montreal Protocol is just one of a growing number of regulatory regimes working to reduce negative impacts on the planet and its people. Cleaning fluids once used to clean electronics parts are now being phased out or have been banned completely. In the US, the Environmental Protection Agency (EPA) has added TCE, Perc and nPB to its watch list. The EU’s Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) regulation currently classifies TCE as a carcinogen, and nPB requires special permission for use in Europe after July 2020. Furthermore, Canada and Japan are severely restricting, or have banned, these solvents altogether. For this reason it is prudent to seek the most sustainable long-term cleaning solutions.

Modern cleaning fluids used in a vapour degreaser have outstanding ecological and safety credentials. They are non-toxic, environmentally-friendly and highly effective cleaning options that perform better than older hazardous solvents.

Highly advanced cleaning fluids have been developed to replace harmful chemicals like HCFC-225, nPB and TCE, which raise air and groundwater quality concerns. The new fluids are not only effective at thoroughly cleaning components, but they also stand up to the regulatory requirements of the medical industry and international governing bodies.

These new progressive cleaning fluids are ozone-safe and meet standards like those required by the EPA or REACH. Additionally, they are approved under the U.S. Significant New Alternatives Policy (SNAP) and the Toxic Substances Control Act (TSCA). They feature minimal Ozone Depletion Potential (ODP) and low Global Warming Potential (GWP).

What is important is that they offer improved environmental properties without compromising on performance.

Working safely

When looking for sustainable long-term cleaning strategies, consideration must also be given to protecting workers’ health and safety. The permissible exposure limit (PEL), the OSHA-designated concentration of a chemical that workers may safely be exposed to, is far more favorable for sustainable cleaning fluids than for older legacy solvents. Typical permissible exposure levels for sustainable fluids are 200-250 ppm, compared with 100 ppm for TCE and just 0.1 ppm for nPB, as rated by the American Conference of Governmental Industrial Hygienists. Modern cleaning fluids are considerably safer for exposed workers and help preserve air quality, a significant benefit.
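As a quick illustration of the safety margins these limits imply (values transcribed from the figures above, not an authoritative OSHA/ACGIH lookup):

```python
# Permissible exposure limits in ppm, as cited above
pel_ppm = {"modern fluid": 200.0, "TCE": 100.0, "nPB": 0.1}

# Ratio of allowable airborne concentrations: a 200-ppm fluid permits
# roughly 2,000 times the concentration allowed for nPB
margin_vs_npb = pel_ppm["modern fluid"] / pel_ppm["nPB"]
margin_vs_tce = pel_ppm["modern fluid"] / pel_ppm["TCE"]
print(round(margin_vs_npb), margin_vs_tce)
```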

In addition to protecting air quality by producing fewer emissions, sustainable cleaning fluids offer a further safety advantage: they are non-flammable. This is important in lowering environmental health and safety risk and protecting workers from accidents.

Cleaning without compromise

There is no compromise when it comes to cleaning medical electronic devices: the process has to deliver exceptional cleaning while also addressing regulatory requirements. Vapour degreasing, when used with modern cleaning fluids, offers an answer to this challenge.

Vapour degreasing is a simple process that is effective at removing contaminants. The low viscosity and surface tension of modern cleaning fluids, combined with their volatility, allow them to clean very effectively even in small crevices and areas that other cleaning options cannot easily reach. This ensures that all the surfaces of the finished component are effectively cleaned, even under tight stand-off components.

Environmentally-progressive cleaning fluids also have the benefit of reducing the risk of bioburden – critical when it comes to medical devices. Since vapour degreasing fluids contain no water, there is no threat of bacterial growth. When parts are cleaned using a modern cleaning fluid inside a vapour degreaser, the parts exit the machine immediately dry, further eliminating the risk of bioburden and helping to meet the criteria needed for process validation.

Today’s vapour degreasing cleaning fluids are effective at thoroughly removing contamination from medical electronic devices while standing up to the regulatory requirements of the medical industry. Formulations are now cleaner, greener and safer. They are the most sustainable and progressive way to clean and help to maintain the reliability of medical electronic devices.  

For help in determining what cleaning fluid to use for your particular medical devices and types of contamination, it is important to work with a fluid supplier that specialises in both cleaning fluids and vapour degreasing. They can recommend the fluids and the methods that will work best.

Source: Med Tech News

UK Government announces multi million pound funding for Moray College Hub at RAF Lossiemouth

£21-million investment in aerospace campus will form part of the Moray Growth Deal.


An aerospace centre of excellence will be established adjacent to RAF Lossiemouth, backed by £21-million funding, UK Government Minister for Scotland Iain Stewart announced today.

Following RAF Lossiemouth taking ownership of a strategic facility to house nine new submarine hunting Poseidon maritime patrol aircraft, the Moray Aerospace, Advanced Technology and Innovation Campus will work with both the RAF and private partners to support the fleet.

The campus will create hundreds of new student places to give Moray residents the skills to work in the aviation sector, helping turn Moray into a global centre for aviation engineering and attracting further businesses to the region.

Today’s announcement means that the UK Government will invest £21-million in the project as part of the Moray Growth Deal, with a further £12.3-million from local partners.

UK Government minister for Scotland, Iain Stewart, visited RAF Lossiemouth today [Wednesday 12 August]. Speaking after his tour, Mr Stewart said:

Since the Moray deal was announced we have been working to make progress, and I am delighted to confirm the UK Government will invest £21-million in this college hub which will help the region prosper from its aerospace sector.

There is a significant way to go to get our economy back on track after the coronavirus pandemic, but we know that the City and Region Growth Deals will be crucial to Scotland’s economic recovery.

The pandemic has called for extraordinary economic measures, and the UK Government has done everything we can to support jobs and businesses. We have supported 900,000 jobs in Scotland with our furlough and self-employed schemes, including 15,700 in Moray.

The UK Government is investing more than £1.5-billion in City Region and Growth Deals across every part of Scotland. This programme is creating thousands of jobs and opportunities and we will continue to work with the Scottish Government to deliver these.

This announcement follows a visit to RAF Lossiemouth from the Prime Minister at the end of last month. Here, he saw the leading contribution made by the Armed Forces in Scotland, both to crises like the coronavirus response and the day-to-day defence of the UK.


The full list of projects in the Moray Growth Deal, all subject to business case approval, is expected to be announced shortly. The ambitious deal is set to create opportunities for those who choose to live and work in Moray, and is jointly funded by the UK Government (£32.5-million), Scottish Government (£32.5-million), and regional partners (TBC).

The Moray Growth Deal funding was announced in July 2019.

Heads of terms on the deal are expected to be signed shortly.

Source: GOV.UK

Cardiff Council deploys wearable to detect early health risks

The ARMED (Advanced Risk Modelling for Early Detection) software, developed by HAS Technology, was adopted by Cardiff Council as part of a response service to put preventative measures into place for its community and residents.


ARMED gives users access to data, allowing for better self-management whilst healthcare professionals can be alerted to potential issues.

Twenty individuals identified as having a potential ‘falls risk’ have been provided with ARMED’s wearable smart watches so that their sleep and mobility data can be remotely monitored.

Aaron Edwards, implementation and delivery manager at Cardiff Council, and chair of the Assistive Technology Network for Wales, was keen to put preventative measures into place during the Coronavirus pandemic.

He said: “I was worried that the pandemic would increase falls further. Research shows a clear link between those who fall frequently and serious injury or entry into residential care.

“With this assistive technology, our residents are able to monitor their daily health and mobility, and our professional support network will be alerted to any potential risk trends. At this point, we may speak to their GP, the Independent Living Service, or Community Rehab physios and put a plan in place to manage falls prevention.”

Following advice from health and social care professionals in the UK to switch to remote interaction where possible, ARMED has been adapted for remote implementation during the pandemic, minimising service-user contact while maximising the ability to stratify risk.

Brian Brown, director of HAS Technology’s ARMED service, added: “We are delighted to be working with Cardiff Council on their Telecare project. ARMED has not only shown how assistive technology can be of benefit during a time of crisis, but also how it can support the sector as we move forward to a new normal way of working.

“We are all anticipating additional pressures as the pandemic progresses and our wearable technology really highlights how having remote but real time access to data can support preventative measures. The goal is to ultimately improve the life of the end user and help them to age well.”

Source: Med-Tech Innovation News

What Is The Artificial Intelligence Revolution And Why Does It Matter To Your Business?



As a species, humanity has witnessed three previous industrial revolutions: first came steam/water power, followed by electricity, then computing. Now, we’re in the midst of a fourth industrial revolution, one driven by artificial intelligence and big data.

I like to refer to this as the “Intelligence Revolution.” But whatever we call it – the fourth industrial revolution, Industry 4.0 or the Intelligence Revolution – one thing is clear: this latest revolution is going to transform our world, just as the three previous industrial revolutions did.

What makes AI so impactful, and why now?

AI gives intelligent machines (be they computers, robots, drones, or whatever) the ability to “think” and act in a way that previously only humans could. This means they can interpret the world around them, digest and learn from information, make decisions based on what they’ve learned, and then take appropriate action – often without human intervention. It’s this ability to learn from and act upon data that is so critical to the Intelligence Revolution, especially when you consider the sheer volume of data that surrounds us today. AI needs data, and lots of it, in order to learn and make smart decisions. This gives us a clue as to why the Intelligence Revolution is happening now.

After all, AI isn’t a new concept. The idea of creating intelligent machines has been around for decades. So why is AI suddenly so transformative? The answer to that question is two-fold:

We have more data than ever before. Almost everything we do (both in the online world and the offline world) creates data. Thanks to the increasing digitization of our world, we now have access to more data than ever before, which means AI has been able to grow much smarter, faster, and more accurate in a very short space of time. In other words, the more data intelligent machines have access to, the faster they can learn, and the more accurate they become at interpreting the information. As a very simple example, think of Spotify recommendations. The more music (or podcasts) you listen to via Spotify, the better able Spotify is to recommend other content that you might enjoy. Netflix and Amazon recommendations work on the same principle, of course.

Impressive leaps in computing power make it possible to process and make sense of all that data. Thanks to advances like cloud computing and distributed computing, we now have the ability to store, process, and analyze data on an unprecedented scale. Without this, data would be worthless.

What the Intelligence Revolution means for your business

I guarantee your business is going to have to get smarter. In fact, every business is going to have to get smarter – from small startups to global corporations, from digital-native companies to more traditional businesses. Organizations of all shapes and sizes will be impacted by the Intelligence Revolution.

Take a seemingly traditional sector like farming. Agriculture is undergoing huge changes, in which technology is being used to intelligently plan what crops to plant, where and when, in order to maximize harvests and run more efficient farms. Data and AI can help farmers monitor soil and weather conditions, and the health of crops. Data is even being gathered from farming equipment, in order to improve the efficiency of machine maintenance. Intelligent machines are being developed that can identify and delicately pick soft ripe fruits, sort cucumbers, and pinpoint pests and diseases. The image of a bucolic, traditional farm is almost a thing of the past. Farms that refuse to evolve risk being left behind.

This is the impact of the Intelligence Revolution. All industries are evolving rapidly. Innovation and change are the new norm. Those who can’t harness AI and data to improve their business – whatever the business – will struggle to compete.

Just as in each of the previous industrial revolutions, the Intelligence Revolution will utterly transform the way we do business. For your company, this may mean you have to rethink the way you create products and bring them to market, rethink your service offering, rethink your everyday business processes, or perhaps even rethink your entire business model.

Forget the good vs bad AI debate

In my experience, people fall into one of two camps when it comes to AI. They’re either excited at the prospect of a better society, in which intelligent machines help to solve humanity’s biggest challenges, make the world a better place, and generally make our everyday lives easier. Then there are those who think AI heralds the beginning of the end, the dawning of a new era in which intelligent machines supersede humans as the dominant lifeform on Earth. 

Personally, I sit somewhere in the middle. I’m certainly fascinated and amazed by the incredible things that technology can achieve. But I’m also nervous about the implications, particularly the potential for AI to be used in unethical, nefarious ways.

But in a way, the debate is pointless. Whether you’re a fan of AI or not, the Intelligence Revolution is coming your way. Technology is only going in one direction – forwards, into an ever-more intelligent future. There’s no going back.

That’s not to say we shouldn’t consider the implications of AI or work hard to ensure AI is used in an ethical, fair way – one that benefits society as well as the bottom line. Of course, we should do that. But it’s important to understand that, however you feel about it, AI cannot be ignored. Every business leader needs to come to terms with this fact and take action to prepare their company accordingly. This means working out how and where AI will make the biggest difference to your business, and developing a robust AI strategy that ensures AI delivers maximum value.

Source: Forbes

The COVID-19 Crisis Is A Boost To Educational Technology Companies

Closed schools, universities, and job training centers have made the need for online education urgent.


In the daily flurry of negative news, investors should be careful not to lose sight of the fact that while COVID-19 has caused thousands of companies to struggle, default on their debt, or declare bankruptcy around the globe, it has also created great business opportunities. With thousands of schools, universities, and job training centers closed around the world, many since February, the need for online education for educators, parents, students and life-long learners has never been more urgent.

COVID-19 has caused a significant disruption in the world of education, but it has been a boon to education technology (EduTech) companies. Even when a vaccine for COVID-19 is approved, online education is here to stay, filling gaps that already existed in education curricula. Online education has been helping reach underserved student populations as well as students with special needs and disabilities. Additionally, many parents and educators are likely to use online education to be ready for the next public health or natural disaster.

Anytime a student or client asks me for advice, I always tell them that long gone are the days when people went to a place of employment and stayed there for almost a lifetime. I myself take classes online several times a year to stay up to date in my professional field. The EduTech companies below cater mostly to K-12, but many also provide online education to students in higher education as well as to those looking for training for their current or prospective careers.

Source: Forbes

No-Contact Pay Is on the Rise Thanks to Near-Field Communication Technology

COVID-19 has many businesses trying to eliminate card and cash contamination.


As the COVID-19 pandemic continues, workplaces around the world are making purposeful changes to keep people healthy. As part of this effort, companies are increasingly gravitating toward contactless payments to minimize contact with potentially contaminated cards and card readers.

Contactless payments are made possible mainly through near-field communication (NFC) or radio frequency identification (RFID) payments. NFC technology is a subset of the RFID used in applications like grocery store scanners.

Whereas RFID information only flows in one direction—from the RFID tag to the device that reads it—NFC offers a two-way transfer of data between devices with embedded chips, such as a smartphone and a payment reader.


A Closer Look at NFC Technology

An NFC chip acts as one part of a required wireless link. It activates when the chip comes close enough for the receiving device to recognize the signal and complete the connection. An NFC-compatible device must operate at a frequency of 13.56 megahertz. It also has a wire coil that acts as an antenna and a low-power microchip.

Most modern smartphones have NFC technology inside, but a user must activate it to make the chip work. The settings on the phone usually include an option to turn NFC tech on or off. In this case, the phone is the active device while a product (such as a payment reader) is the passive object—also called an NFC tag. 

The tag does not need a power source, and it only activates once an active device comes into range. Once that happens, electromagnetic induction creates a current in the tag. The active device generates a magnetic field via the wire loop accompanying the phone’s chip. 



The basic structure of data transmission with NFC. Image used courtesy of Rohde & Schwarz

Once the passive device gets close enough to the active device’s magnetic field, the electrons in the passive device’s wire coil start producing a current that matches the one in the transmitting smartphone. This process creates the power necessary for the passive device, allowing it to sync with the phone and send or receive data. NFC-enabled devices do not require Wi-Fi to work, and there is no need to pair them.

NFC payments also have a secure element chip protected by a digital signature either associated with the active device’s chip or a cloud-based system. Many analysts assert that NFC is an extremely safe payment option. Even without that digital signature, a hacker must be close enough to the active device to turn on the technology and make the transmission happen.


NFC Tech in Action

RFID tech works at a distance of several feet, but NFC only functions within a few inches. Sometimes this short range poses a limitation. On the other hand, it boosts security by eliminating the chance of a person unintentionally authorizing a payment.

The active device in an NFC transaction is not necessarily a phone. A recent development from NXP, the mWallet 2GO, put the technology into a watch strap. Wearers hold their wrist near the passive payment reader. The mWallet 2GO also offers data protection and encryption to protect people against a wide variety of potential attack types meant to compromise personal information. 



Montblanc’s TWIN smart strap includes NXP’s Wallet 2GO technology. Image (modified) used courtesy of NXP

NXP also envisions a future where people use NFC technology in smart cities. They could use it to pay for parking or gain entry to buildings, for example. 

Promotional materials can serve as passive devices, too. For example, a movie poster may have an NFC-enabled sticker on it, allowing a person to aim their phone at it and get information. Another option is to use NFC to transfer data between two phones. There’s even a shirt called the CashCuff that lets people pay while wearing the garment by holding their arm up to a passive reader. 


A Promising Technology

NFC-enabled payments happen substantially faster than buying something with cash or a card, even though NFC’s data transfer is much slower than some other transmission methods. Its maximum transfer rate of 0.424 megabits per second is less than a quarter of Bluetooth’s maximum speed.
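That slow raw rate matters less than it sounds for payments, because the payload is tiny. A minimal sketch of the arithmetic, assuming a payment exchange of roughly 1 KB and a nominal Bluetooth 2.x EDR rate of 2.1 Mbit/s for comparison (both figures are illustrative assumptions, not from the article):

```python
NFC_MAX_BPS = 424_000          # NFC maximum: 0.424 Mbit/s (from the article)
BLUETOOTH_EDR_BPS = 2_100_000  # assumed nominal Bluetooth 2.x EDR rate

def transfer_time_s(payload_bytes: int, rate_bps: int) -> float:
    """Idealized airtime for a payload at a given raw link rate."""
    return payload_bytes * 8 / rate_bps

payload = 1024  # assume ~1 KB of command/response traffic per transaction
t_nfc = transfer_time_s(payload, NFC_MAX_BPS)
t_bt = transfer_time_s(payload, BLUETOOTH_EDR_BPS)

print(f"NFC: {t_nfc * 1000:.1f} ms, Bluetooth EDR: {t_bt * 1000:.1f} ms")
```

Even at under a quarter of Bluetooth’s rate, a payment-sized payload moves in about 20 ms of airtime; the time a shopper perceives is dominated by bringing the devices into range, not by the link speed.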

Even electrical engineers who don’t primarily work on payment technology will likely find themselves becoming more familiar with NFC soon. The technology has vast potential and supports consumers’ desire for frictionless data transfers.



Source: All about circuits

Liverpool tech hub makes innovation deal with Unilever



Liverpool-based innovation hub Sensor City is to provide London-headquartered consumer giant Unilever with a package of technical and business support over the next two years, as part of a joint commitment to drive innovation within Liverpool City Region.

Sensor City has secured a two-year extension of its corporate membership deal with the brand.

The agreement will see Sensor City, based in the Knowledge Quarter Liverpool (KQ Liverpool) Mayoral Development Zone, deliver support to the global FMCG producer, comprising technical project delivery, facilitated access to student and academic researchers, and collaborative engagement opportunities.

Louise Goodman, Business and Innovations Manager at Sensor City, said: “We are delighted to be continuing our relationship with Unilever and building on the proven value that our engineers and business support teams can offer.

“Working with high profile companies like Unilever adds another dimension to our dynamic business community and we look forward to supporting Unilever’s innovation agenda over the next two years.”

Unilever’s package of support will be delivered through a call on/call off package, which will ensure a rapid response to new opportunities for innovation and provide agility to meet the demands of Unilever’s business objectives, as they evolve over the coming months.

The initial framework of activity will focus on access to Liverpool City Region’s student and academic talent, integration into the local start-up ecosystem and collaboration opportunities with Sensor City’s partner universities.

Additionally, technical support through Sensor City’s multi-disciplinary engineering team and their £1m laboratory facilities will provide Unilever with access to rapid prototyping and creative R&D capabilities.

Helen Brannon, Research and Development Manager at Unilever, said: “Our membership of Sensor City gives us access to rapid prototyping facilities, potential collaborations with SME companies hosted at Sensor City as well as access to future talent through potential projects with the students of Sensor City.”

Sensor City’s ongoing relationship with Unilever extends back to 2017, when the innovation hub ran a two-day Hack and Pitch event, with Innovate UK, to find new approaches and solutions for three packaging challenges set by Unilever’s Advanced Manufacturing Centre in Port Sunlight, Wirral.

Over the next 24 months, Sensor City plans to support the connection of Unilever’s research priorities to student project opportunities within both the University of Liverpool and Liverpool John Moores University.

Source: Business Cloud

Ready-to-Use and Development Services Grow with Injectables Market


Challenges around injectable products include regulatory requirements, pricing pressures, and an increasingly complex industry landscape.


Combination products are on the rise for improving health outcomes and patient adherence, but merging drug development with device development makes the field challenging for manufacturers to navigate.

Statista reports that in 2018, research and development spending in the pharma industry totalled $179 billion globally. As companies work to develop injectable biologics for patient needs, many turn to partners to overcome development hurdles (Competitive Drugs Demand Innovative Delivery Devices).

Another way suppliers are helping customers speed time to market is by offering pre-sterilized, ready-to-use (RTU) primary packaging to streamline pharmaceutical fill/finish operations. These systems are often designed for clinical development phases and small-batch commercial manufacturing and can include components such as vials, seals, stoppers, cartridges and syringes. This eliminates preparation steps for glass containers (washing, depyrogenating and sterilizing) so that the manufacturer can focus space and resources on filling operations. Global Market Insights estimates that the market for depyrogenated sterile empty vials will surpass $1.4 billion by 2026.

How development services help 

Development services can help companies determine factors such as what type of analytical testing is needed, what information goes into the regulatory filing, and what type of information is needed from suppliers so they can enhance their filing with the combination product. It all starts with the molecule—that determines where a manufacturer goes within a regulatory pathway. 

West Pharmaceutical Services has put together an Integrated Solutions Program that seeks to simplify the drug development path regardless of what stage a manufacturer is in. For prefilled syringe systems, for example, it’s important to look at drug/package compatibility and device functionality early on. This is also true when a manufacturer wants to take the prefilled syringe and put it into an auto-injector. “With a cartridge system, you’re putting it into a pen or an on-body system. What things do they need to think about in each phase of development? What do they need to do in phase 1 to properly evaluate not only the primary packaging but that device? In phase 2 and phase 3? We’ve laid out that roadmap to help them move forward in each stage of development,” says Jennifer Riter from West Pharmaceutical Services.

Stevanato Group offers another development service—a product development collaboration with Cambridge Design Partnership (CDP) for pen-injector technology. The agreement combines CDP’s design and development experience in drug delivery devices with Stevanato Group’s experience in glass containers, tooling, injection moulding, device assembly, and its global commercial network.

RTU options have emerged from numerous suppliers, including Stevanato’s vials in its EZ-fill configuration and DWK Life Sciences’ WHEATON CompletePAK, a customizable set of commonly used RTU vials, stoppers, and seals delivered with USP certifications for traceability and under a single order number for customers.

Development services may also assist with the manufacturing/filling side, or determining what training and on-boarding are needed to get the devices into the clinic and to caregivers. 


Riter notes a few trends across the market. Small molecules may be packaged with a higher pH, so manufacturers must choose containment that avoids leachables in the drug product. For large molecules, manufacturers are using a lot of excipients or surfactants to stabilize the drug, which require additional consideration for compatibility with packaging. “And for a more viscous product, they have to consider how to deliver the drug product in a reasonable amount of time into the patient,” Riter says.

Because of product-specific complexity, looking at types of containment and delivery in the preclinical phase or phase 1 can help to minimize risk and avoid returning to the drawing board after considerable development.


While there are standards such as ISO and compendium studies, manufacturers have to customize and optimize them based on their drug product and delivery system, and a partner can help with that development. “With the Office of Combination Products, in terms of the type of information, how it’s put together, and how it ties the drug to the product, we’ve seen the data expectations increase.”

Changes to USP 381, and a new chapter in USP 382, focus on the functionality of primary containment systems which now includes prefilled syringe systems and cartridges as well as the vial/stopper/seal. “Also with the ICH Q3D Elemental impurities, customers are trying to understand the elemental impurities that come into their drug product from the primary packaging device over time,” Riter notes.

While Europe does not have an office of combination products, new expectations for 2020 mean manufacturers are putting in applications for a medicinal product combined with a device.

Mitigating risk at different phases

  • In phase 1, a brand owner may be looking at packaging configurations. If the product has a type of surfactant or excipient that may pull extractables from the stopper, a development firm could help with early compatibility studies with different packaging systems to determine what could be the appropriate type of system.


  • In phase 2, a brand owner is starting to look at extractables/leachables and at container closure integrity (CCI). With CCI, they may turn to a development firm to determine the right technique to look at their product based on their minimum allowable leak limit. “When you think about things like CCI, per 1207, they want you to validate your system with the drug product. You can’t just run a technique and you’re finished. They want to make sure you are testing through shelf-life or the life cycle management of your product, that you’re actually getting robust results and using an accurate test method.”


  • Cell and gene therapies are normally stored under cryogenic and frozen conditions. For a standard biologic, helium leak testing may be appropriate at -80° C, where a lot of biologics ship. “But cell and gene therapies are stored over their shelf life at -180° C. We can develop a method for CCI via helium leak under cryogenic conditions to monitor real-time over the time the product will be in that container.”

Source: Healthcare Packaging 

Virtual Events Vs. In-Person Events: Why You Should Host Your Event Online

Conferences are ubiquitous in most industries. There is something to be said for paying hundreds of dollars in travel and accommodation, showing up at a venue before the sun rises, and sitting in a stuffy conference centre for hours on end.


To be honest, in-person events and conferences can be a really great way to distribute content and network, but are they the most effective way to build and connect with an audience?

As businesses and governments look to tackle climate change, there has been a renewed focus on digital alternatives to events, namely virtual events (not to mention the increased interest coming from businesses as they try to manage the fallout from coronavirus event cancellations).

With all this in mind, the question remains: which are better, in-person or virtual events?

But first, let’s get back to basics…

What is a Virtual Event?

Before we can answer that question, it helps to understand exactly what we’re talking about when we refer to virtual events.

A virtual event is a large, multi-session online event that often features webinars and webcasts. At its most basic, a virtual event is one in which people interact in an online environment rather than at a physical location. Virtual events work well because they are highly interactive and give a similar look and feel to a physical event.

Types of Virtual Events

Okay, now that we know what a virtual event is, we can take a look at the different types of virtual events. Yes, there are a number of different types of virtual events – most of which are tailored to specific goals or purposes.

Virtual Open Days


Virtual Open Days are the virtual alternative to traditional physical colleges or university open houses. Choosing a college or university to attend is a life-changing – and often daunting – decision and Virtual Open Days give prospective students the freedom and flexibility to learn more about potential academic institutions, so they can make a more informed decision.

Virtual Conferences

Similar to conventional conferences, virtual conferences are highly engaging and include a series of large sessions run by industry thought leaders who present to and interact with attendees. Virtual conferences achieve this by hosting a virtual lobby where attendees can choose sessions and streams to “attend.” And, like conventional conferences, virtual conferences offer opportunities for sponsorship and ticket sales, and allow attendees to engage with speakers through polls, chats, and live Q&As.

Virtual Career Fairs

A virtual career fair, or online job fair, is a virtual event that allows employers to connect with job seekers in a virtual environment, rather than a physical one. Virtual career fairs are transforming the way job seekers and potential employers interact. Just like with other types of virtual events, interactive features such as webinars, webcasts, live chat, chat rooms and more, break down geographical barriers and extend the reach of potential employers without losing the valuable engagement that attracts prospects.

Virtual Trade Show

Virtual trade shows are considered the online equivalent of traditional trade shows, but unlike their physical counterparts, virtual trade shows are not limited by their geographical location. Just like with traditional trade shows, attendees interact with hosts and sponsors, getting access to loads of valuable content. And, they can attend virtual trade shows anywhere, anytime – all that’s needed to attend is a computer or mobile device with a good internet connection. 

Virtual Benefits Fairs

A virtual benefits fair is an event designed to help employers and benefits providers communicate and explain the benefits provided to their employees at scale through an interactive online environment, rather than a physical one.

Only 40% of employers help employees understand and maximize their work benefits, so virtual benefits fairs make providing this information to employees more affordable and scalable by removing the physical and time-based barriers that often come with in-person events. This can be especially useful for Enterprise organisations with multiple locations. These benefits fairs result in more informed employees who benefit from understanding the entire scope of their compensation package, rather than just their salaries. 

Pros of Hosting Your Event Online

Now that we know what a virtual event is and what types of events there are, we can really start to explore the benefits of this type of event (and there are heaps of them!). 

  • Flexible: What’s great about virtual events is that they’re flexible and can be tailored to your business needs. If you can think of an event you can host in-person, the odds are that you can host it virtually, you just need to ensure you have the right platform and tools at your disposal.
  • Cost-effective: Venue hire can run in the tens of thousands of dollars just for one day, and that doesn’t even take into account the cost of food and drink, accommodation for speakers, and venue insurance. With virtual events, those costs are non-existent. All you have to pay for is the platform, promotion, and speakers (depending on your business model).
  • Scalable: Most physical venues have a limit to the number of people allowed in, and then you have to take into account the cost of hosting thousands of people. Virtual events make it much easier to scale, meaning you can host more people for a fraction of the cost (WorkCast, for example, can host up to 50,000 attendees per event). This also means you can expand your reach and promote brand awareness.
  • Engaging: 30% of people are more likely to speak to a person in a virtual booth. Maybe that’s because people are scared to approach someone at an event or they just enjoy the anonymity. Either way, virtual events offer tools such as polls, Q&A, live chat, and even a downloadable resources section so your attendees can fully engage with your content. Also, some platforms offer the ability to integrate with third-party widgets, so you can really ramp up the interaction.
  • Environmentally friendly: The costs of hosting an in-person event aren’t just monetary, they can be environmental. U.S. residents made 463.6 million person‐trips for business purposes in 2018, with 38% for meetings and events. That’s more than 175 million trips to meetings and events alone. Virtual events take away the need for people to travel hundreds of miles to venues, and instead allow them to access your event from much closer to home, not only saving businesses money but reducing the impact on the environment.
  • Reliable: Virtual events are nothing if not resilient. There is no need to cancel a virtual event because of weather or even a global pandemic (looking at you, coronavirus). Because attendees can view the event from anywhere, factors that would cripple a physical event aren’t even a consideration.
  • Provide comprehensive analytics and reporting data: Want to know how many people viewed an entire presentation? Or, want to find out how many questions were asked? These are hard stats to pull from in-person sessions but are at your finger-tips with online events. Why is this important? Knowing your viewing stats or engagement scores can really help you fine-tune your content and determine what’s working and what’s not without having to rely on a post-event survey that only 10% of attendees will actually respond to.

Cons of Virtual Events

While virtual events are a great alternative to physical events, there are some cons to be aware of when considering them as an option.

  • Lack of networking opportunities: While there are heaps of opportunities for attendees to engage with speakers and content, there are fewer opportunities for attendees to interact with one another. You can mitigate this with live attendee chat and social networking events, but there aren’t really any ways to interact with other attendees offline.
  • Distractions: Let’s be honest, offices are full of distractions. For example, as I am writing this blog I am getting IMs and can see that I have a gazillion emails in my inbox. Attendees at a virtual event will have to contend with distractions that aren’t necessarily there in-person, which is why we always recommend attendees and presenters try to find a quiet place to attend the online event.
  • No getting away from the office: Attending an in-person event – a business vacation of sorts – can be an opportunity for people to get out of the office, making them very enthusiastic about participating in conference and event activities. Virtual events don’t offer that change of scenery.
  • Can limit audience: I’m not talking about audience number, but rather types of audience. If your prospective attendees aren’t all that internet savvy, a virtual event may turn them off, so it’s important to provide potential attendees with links to any FAQs or contact details for event support prior to the event.

How to Host a Virtual Event

If you do decide to host a virtual event, you’ll need to plan out how you are going to do this. You can do so by following these simple steps:

  1. Decide on the type of event you want to host: Are you looking at running a large multi-session virtual event or will a single session webinar or webcast do? The type of event you want to run will inform your platform needs.
  2. Choose a platform: Not all online event platforms are created equally, and not all will have the ability to host the type of event you want to run, so make sure you know exactly what functionality and support your platform provider offers.
  3. Decide whether it will be free to attend or requires tickets: Virtual events are similar to physical ones in that you can gate them and charge people to attend. If you do decide on a paid model, ensure your platform can accommodate this.
  4. Choose your event environment: You’ll likely have to work closely with your platform provider on this. You will need to decide what type of branding you want, whether you need room for sponsorship branding, and what types of sessions will be on offer.
  5. Choose your speakers: You don’t need this information immediately but, as with physical events, speakers can be a big draw, so determine who your speakers are and put their information on your promotional material (including your registration page).
  6. Decide on your date and time: Keep in mind factors such as audience and speaker availability (though pre-recording and simulive can help on this end).
  7. Promote your webinar: Just as with an in-person event, having a multi-channel promotion strategy – e.g. social media, email marketing, landing pages, and website banners – will ensure the success of your event.
  8. Create engaging content to captivate your audience: This can either be done by your events/marketing team for smaller events, or by your speakers in more traditional conference layouts. Just make sure you have your content at least a few days ahead of time, so you have time to upload it and rehearse.
  9. Practice, practice, practice: Speaking of rehearsal, it’s important to ensure you know your content (or your speakers do). Scheduling in dry runs means that there won’t be any surprises on the day and your audience will get an awesome experience.
  10. Follow up with your audience: After the event, reach out to your audience – whether it’s with a post-event survey, a link to recorded content, or even an early-bird invite to your next event. It’s important to keep them engaged even after your event is finished.

Virtual Event Ideas

Need some help coming up with some ideas for virtual events? Here are some examples to get you started:


South by Southwest (SXSW): SXSW is an annual conference/festival featuring film, interactive media, and music events that takes place in Austin, Texas. What’s great about SXSW is event organizers stream the keynote speeches online for all to see and enjoy, increasing their accessibility and opening up content to those who can’t get down to Texas.

Idea: Stream panel discussions to extend the reach of physical events you’re already hosting.

AWSome Day Online Conference: This Amazon event showcases the AWS platform and its core capabilities. It’s free and looks to engage IT managers, system engineers, system administrators, developers, architects, and business leaders, with content available in English, French, Spanish, Italian and German.

Idea: Include free training or certification as part of your online conference to increase the ROI for attendees.

Graduate Prospects UK Virtual Career Fair: Prospects has worked at the heart of the higher education sector for almost 50 years and is the only careers organisation to invest its profits back into education. In June 2019, they started running virtual careers fairs for graduates to improve their prospects after graduation. Their first event garnered over 2,500 registrants and had multiple sessions with recruiters.

Idea: Use multiple sessions to create streams for attendees, meaning they can focus on the content that most interests them.


CBI Annual Conference: CBI is one of the UK’s premier business organisations, directly influencing over 7 million professionals. Their annual conference is their flagship event, with guest speakers previously including the Prime Minister, the Leader of the Opposition and many other leading business speakers. Each year, they replicate their annual conference with a live conference stream, extending their reach to audiences in the regions and internationally.

Idea: Gate exclusive events with a pay-to-attend model. This way you can increase accessibility while also generating revenue.

As you can see, virtual events are an engaging, flexible, and highly resilient way to run your events. Whether you replace your physical event completely or supplement it with digital content, taking your conferences, trade shows, and careers fairs online can increase your reach, minimize your costs, and delight your audience.

Google Engineers ‘Mutate’ AI to Make It Evolve Systems Faster Than We Can Code Them



Much of the work undertaken by artificial intelligence involves a training process known as machine learning, where AI gets better at a task – such as recognising a cat or mapping a route – the more it does it. Now that same technique is being used to create new AI systems, without any human intervention.

For years, engineers at Google have been working on a freakishly smart machine learning system known as the AutoML system (or automatic machine learning system), which is already capable of creating AI that outperforms anything we’ve made.

Now, researchers have tweaked it to incorporate concepts of Darwinian evolution and shown it can build AI programs that continue to improve upon themselves faster than they would if humans were doing the coding.

The new system is called AutoML-Zero, and although it may sound a little alarming, it could lead to the rapid development of smarter systems – for example, neural networks designed to more accurately mimic the human brain with multiple layers and weightings, something human coders have struggled with.

“It is possible today to automatically discover complete machine learning algorithms just using basic mathematical operations as building blocks,” write the researchers in their pre-print paper. “We demonstrate this by introducing a novel framework that significantly reduces human bias through a generic search space.”

The original AutoML system is intended to make it easier for apps to leverage machine learning, and already includes plenty of automated features itself, but AutoML-Zero takes the required amount of human input way down.

Using a simple three-step process – setup, predict and learn – it can be thought of as machine learning from scratch.

The system starts off with a selection of 100 algorithms made by randomly combining simple mathematical operations. A sophisticated trial-and-error process then identifies the best performers, which are retained – with some tweaks – for another round of trials. In other words, the neural network is mutating as it goes.

When new code is produced, it’s tested on AI tasks – like spotting the difference between a picture of a truck and a picture of a dog – and the best-performing algorithms are then kept for future iteration. Like survival of the fittest.
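The loop described above – generate candidate programs from basic mathematical operations, score them on a task, keep the best performers, and mutate them for the next round – is a classic evolutionary search. Here is a toy sketch of that idea (not Google’s actual AutoML-Zero code; the register machine, the operation set, and the simple regression task are all illustrative assumptions):

```python
import random

OPS = ["add", "sub", "mul", "copy"]
NUM_REGS = 4  # register 0 holds the input; register 3 is read as the output

def random_instruction():
    # (operation, destination register, source register a, source register b)
    return (random.choice(OPS), random.randrange(NUM_REGS),
            random.randrange(NUM_REGS), random.randrange(NUM_REGS))

def random_program(length=8):
    return [random_instruction() for _ in range(length)]

def run(program, x):
    """Execute a program on input x and return the output register."""
    regs = [0.0] * NUM_REGS
    regs[0] = x
    for op, dst, a, b in program:
        if op == "add":
            regs[dst] = regs[a] + regs[b]
        elif op == "sub":
            regs[dst] = regs[a] - regs[b]
        elif op == "mul":
            regs[dst] = regs[a] * regs[b]
        else:  # copy
            regs[dst] = regs[a]
    return regs[3]

def fitness(program, samples):
    """Mean squared error on the task -- lower is better."""
    total = 0.0
    for x, y in samples:
        diff = run(program, x) - y
        total += diff * diff
    return total / len(samples)

def mutate(program):
    """Copy the program with one instruction replaced at random."""
    child = list(program)
    child[random.randrange(len(child))] = random_instruction()
    return child

def evolve(samples, population=100, generations=200):
    pool = [random_program() for _ in range(population)]
    for _ in range(generations):
        pool.sort(key=lambda p: fitness(p, samples))
        survivors = pool[: population // 4]  # retain the best quarter
        pool = survivors + [mutate(random.choice(survivors))
                            for _ in range(population - len(survivors))]
    return min(pool, key=lambda p: fitness(p, samples))

random.seed(0)
target = [(x, float(x * x)) for x in range(-5, 6)]  # task: learn f(x) = x^2
best = evolve(target)
print("best mean squared error:", fitness(best, target))
```

Because the best quarter always survives unchanged, the top score can only improve from one generation to the next – which is the “survival of the fittest” dynamic the article describes, just at toy scale.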

    And it’s fast too: the researchers reckon up to 10,000 possible algorithms can be searched through per second per processor (the more computer processors available for the task, the quicker it can work).

    Eventually, this should see artificial intelligence systems become more widely used, and easier to access for programmers with no AI expertise. It might even help us eradicate human bias from AI because humans are barely involved.

    Work to improve AutoML-Zero continues, with the hope that it’ll eventually be able to spit out algorithms that mere human programmers would never have thought of. Right now it’s only capable of producing simple AI systems, but the researchers think the complexity can be scaled up rather rapidly.

    “While most people were taking baby steps, [the researchers] took a giant leap into the unknown,” computer scientist Risto Miikkulainen from the University of Texas, Austin, who was not involved in the work, told Edd Gent at Science. “This is one of those papers that could launch a lot of future research.”

    The research paper has yet to be published in a peer-reviewed journal, but can be viewed online at

After Covid-19: Anticipate the skills gap with skills mapping


It is close to impossible to predict how the world will change after the Covid-19 pandemic crisis is over and how people will evolve. 


But one thing is certain: the way we work will be deeply transformed (and already has been!). And with transformation come new skills to master and new roles to create.

How cutting edge materials are shaping the future of 3D printing

The latest edition of the AM Focus 2020 Series addresses the cutting edge of additive manufacturing materials today: technical ceramics, continuous fibre-filled composites, refractory metals and high-performance polymers.


3dpbm has announced the fourth edition of the company’s AM Focus eBook series. This latest publication spotlights a far-reaching topic within the additive manufacturing industry: advanced materials.

3dpbm sees this group as creating numerous opportunities across the aerospace, automotive, defence, medical, electronics and dental industries, among others.

Comprising over 50 pages, this eBook provides an analysis of the advanced materials landscape — with market insights from leading industry analyst group SmarTech Analysis — as well as a glimpse into the work of some of the leading companies in ceramics and advanced composites.

Special features zoom in on segment leaders and pioneers such as 3DCERAM SINTO, 9T Labs and Lithoz. These companies, like most that work in advanced materials for AM, are driven by innovation and have a unique vision for the future of additive.

The eBook also addresses many of the most relevant materials and AM hardware companies that are pushing the boundaries of ceramics, composites, refractory metals and advanced polymers.

  • Ceramics companies include 3DCERAM, ExOne, Lithoz, voxeljet and XJet.
  • Composites companies include 9T Labs, Anisoprint, Arevo and Markforged.
  • Refractory metal companies include HC Starck, Heraeus and LPW/Carpenter Technology.
  • High-performance polymer companies include EOS, OPM, Roboze, SABIC, Solvay and Stratasys.

“We at 3dpbm have been eagerly following the progress of advanced materials in additive manufacturing for several years, and we are pleased to put the segment in the spotlight in a new, reader-friendly way,” commented 3dpbm’s editor in chief Tess Boissonneault. “We expect this to be one of our most exhaustive publications on this topic to date and are already looking forward to the next eBook edition, which will focus on sustainability in 3D printing, another topic we—and the AM industry at large—care very much about.”

3dpbm’s eBook can be viewed or downloaded on the 3dpbm website here. To maintain an open discourse and open access to AM news, the publication is free to access.

Source: Aerospace Manufacturing

MedTech firm EnMovi secures £2.5m for wearable tech research




(L-R) Minister for Trade, Investment and Innovation Ivan McKee and EnMovi CEO Roman Bensen

MedTech firm EnMovi has secured a £2.5m grant from Scottish Enterprise to power new research and development.

The grant will allow the firm to research new wearable technology which will work alongside an app for patients. It is expected the app will provide rehabilitation guides and home exercise plans to patients as well as contact with healthcare professionals.

The Scottish firm was incorporated in October of last year, and is a subsidiary of US-based OrthoSensor and McLaren Applied, a technological offshoot of the McLaren Group, which is best known for its Formula 1 cars.

Its new research and development base at the University of Strathclyde’s Inovo building is expected to create 19 jobs.

Trade, investment and innovation minister Ivan McKee said: “This funding will support EnMovi to capture data and develop wearable technology.

“This will allow for less invasive surgery and faster recovery times for patients.

“This project, which will see a new research and development centre established at the University of Strathclyde’s Inovo building, also brings exciting employment opportunities and will help establish Scotland at the forefront of research into this cutting-edge new technology.”

Source: Business Cloud


Ten years on, why are there still so few women in tech?

With the number of women employed in the digital workforce hovering around 17% for the past decade, more needs to be done to diversify the industry


7 Artificial Intelligence Trends to Watch in 2020



Artificial Intelligence offers great potential and great risks for humans in the future. While still in its infancy, it is already being employed in some interesting ways. 

Here we explore some of the main AI trends predicted by experts in the field. If they come to pass, 2020 should see some very exciting developments indeed.

What are the next big technologies?


According to sources like Forbes, some of the next “big things” in technology include, but are not limited to: 

  • Blockchain
  • Blockchain As A Service
  • AI-Led Automation
  • Machine Learning
  • Enterprise Content Management
  • AI For The Back Office
  • Quantum Computing AI Applications
  • Mainstreamed IoT

What are some of the most exciting AI trends?

According to sources like The Next Web, some of the main AI trends for 2020 include: 

  • The use of AI to make healthcare more accurate and less costly
  • Greater attention paid to explainability and trust 
  • AI becoming less data-hungry
  • Improved accuracy and efficiency of neural networks
  • Automated AI development
  • Expanded use of AI in manufacturing
  • Geopolitical implications for the uses of AI

What AI trends should you watch in 2020?

Further to the above, here are some more AI trends to look out for in 2020. 

1. Computer Graphics will greatly benefit from AI

One trend to watch in 2020 will be advancements in the use of AI in computer-generated graphics. This is especially true for more photorealistic effects like creating high fidelity environments, vehicles, and characters in films and games. 

Recreating on screen a realistic copy of metal, the dull gloss of wood or the skin of a grape is normally a very time-consuming process. It also tends to need a lot of experience and patience from a human artist.

Various researchers are already developing new methods of helping to make AI do the heavy work involved in creating complex graphics. NVIDIA, for example, has already been working on this for several years.

They are using AI to improve things like ray tracing and rasterization, to create a cheaper and quicker method of rendering hyper-realistic graphics in computer games. 

Researchers in Vienna are also working on methods of partially, or even fully, automating the process under the supervision of an artist. This involves the use of neural networks and machine learning to take prompts from a creator to generate sample images for their approval. 


2. Deepfakes will only get better, er, worse

Deepfake is another area that has seen massive advancement in recent years. 2019 saw a plethora of, thankfully humorous, deepfakes that went viral on many social media networks. 

But this technology will only get more sophisticated as time goes by. This opens the door for some very worrying repercussions which could potentially damage or destroy reputations in the real world. 

With deepfakes already becoming very hard to distinguish from real video, how will we be able to tell if anything is fake or not in the future? This is very important, as deepfakes could readily be used to spread political disinformation, corporate sabotage, or even cyberbullying. 

Google and Facebook have been attempting to get out ahead of the potential negative aspects by releasing thousands of deepfake videos to teach AIs how to detect them. Unfortunately, it seems even AI has been stumped at times. 

3. Predictive text should get better and better

Predictive text has been around for some time now, but by combining it with AI we may reach a point where the AI knows what you want to write before you do. “Smart” email predictive text is already being tested on programs like Gmail, for example. 

If used correctly, this could help users speed up their writing significantly, and could be especially useful for those with physical conditions that make typing difficult. Of course, many people will find themselves typing out the full sentence anyway, even if the AI correctly predicted their intentions. 
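Under the hood, even a crude predictive-text model can be built from word-frequency statistics. As a highly simplified sketch (a toy bigram model, nothing like the neural networks behind features such as Gmail’s predictive text; the corpus and function names are illustrative):

```python
from collections import Counter, defaultdict

def train(corpus):
    """For each word, count which words tend to follow it."""
    following = defaultdict(Counter)
    words = corpus.lower().split()
    for current, nxt in zip(words, words[1:]):
        following[current][nxt] += 1
    return following

def predict(model, word):
    """Return the most frequently observed next word, or None if unseen."""
    candidates = model.get(word.lower())
    if not candidates:
        return None
    return candidates.most_common(1)[0][0]

model = train("thank you for your help . thank you for your time . "
              "looking forward to your reply")
print(predict(model, "thank"))  # prints "you" -- seen twice after "thank"
```

A real system replaces these counts with a language model trained on vast amounts of text, but the interface is the same idea: context in, most likely continuation out.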

How this will develop in 2020 is anyone’s guess, but it seems predictive text may become an ever-increasing part of our lives. 

4. Ethics will become more important as time goes by

As AI becomes ever-more sophisticated, developers will be under more pressure to keep an eye on the ethics of their work. An ethical framework for the development and use of AI could define how the human designers of AI should develop and use their creations, as well as what AI should and should not be used for. 

It could also eventually define how AI itself should behave, morally and ethically. Called “Roboethics” for short, the main concern is preventing humans from using AI for harmful purposes. Eventually, it may also include preventing robots and AI from doing harm to human beings. 

Early references to Roboethics include the work of author Isaac Asimov and his “Three Laws of Robotics”. Some argue that it may be time to encode many of Asimov’s concepts into law before any truly advanced AIs are developed. 

5. Quantum computing will supercharge AI

Another trend to watch in 2020 will be advancements in quantum computing and AI. Quantum computing promises to revolutionize many aspects of computer science and could be used to supercharge AI in the future.

Quantum computing holds out the hope of dramatically improving the speed and efficiency of how we generate, store, and analyze enormous amounts of data. This could have enormous potential for big data, machine learning, AI, and privacy. 

By massively increasing the speed of sifting through and making sense of huge data sets, AI and humanity could benefit greatly. It could also make it possible to quickly break virtually any encryption – making privacy a thing of the past. The end of privacy or a new Industrial Revolution? Only time will tell.

6. Facial recognition will appear in more places

Facial recognition appears to be en vogue at the moment. It is popping up in many aspects of our lives and is being adopted by both private and public organizations for various purposes, including surveillance. 

Artificial Intelligence is increasingly being employed to help recognize individuals and track their locations and movements. Some programs in development can even help detect individual people by analyzing their gait and heartbeat.

AI-powered surveillance is already in place in many airports across the world and is increasingly being employed by law enforcement. This is a trend that is not going away anytime soon. 

7. AI will help in the optimization of production pipelines

The droid manufacturing facility in Star Wars Episode II: The Clone Wars might not be all that far, far away. Fully autonomous production lines powered by AI are set to be with us in the not-too-distant future. 

While we are not quite there yet, AI and machine learning are being used to optimize production as we speak. This promises to reduce costs, improve quality, and reduce energy consumption for those organizations who are investing in it. 

Source: Interesting Engineering

By Christopher McFadden

Importance of Soft Skills for Software Architects

Very often software architects get a reputation for being experts at programming and at building solid architectures, while having problems with project management or client relationships.


Soft skills are very important nowadays. Whereas hard skills can be learned and perfected over time, soft skills are more difficult to acquire and change. Actually, I would say that the importance of soft skills in your job search and overall career is greater than you think because they help facilitate human connections. Soft skills are key to building relationships, gaining visibility, and creating more opportunities for advancement. And this article is about the importance of soft skills for software architects.

The importance of soft skills. What are soft skills, and why do you need them?

You shouldn’t underestimate the importance of soft skills. Basically, you can be the best at what you do, but if your soft skills aren’t good, you’re limiting your chances of career success.

Soft skills are the personal attributes you need to succeed in the workplace. In other words, soft skills are a combination of social skills, communication skills, flexibility, conflict-resolution and problem-solving skills, critical thinking, and emotional intelligence, among others, that enable people to effectively navigate their environment, work well with others, perform well, and achieve their goals, complementing their hard skills.

Soft skills are the difference between adequate candidates and ideal candidates. In most competitive job markets, recruitment criteria do not stop at technical ability and specialist knowledge. Employers look for a balance of hard and soft skills when they make hiring decisions. For example, employers value skilled workers with a track record of getting the job done on time. Employers also value workers with strong communication skills and a strong understanding of company products and services.

Even though you may have exhaustive knowledge of your area, you will find it difficult to work with people and keep your projects on track if you lack interpersonal and negotiation skills. And soft skills are not just important when facing external customers and clients; they are equally important when it comes to interacting with colleagues. Soft skills relate to how you work with others. Employers value soft skills because they enable people to function and thrive in teams and in organisations as a whole. A productive and healthy work environment depends on soft skills. After all, the workplace is an interpersonal space, where relationships must be built and fostered, perspectives must be exchanged, and occasionally conflicts must be resolved.

Essential soft skills for being a good software architect

In Apiumhub we believe that the most successful architects that we have met possess more than just great technical skills. They also have qualities that enable them to work well with people.

There are a lot of brilliant technologists who can solve just about any technical problem but are so arrogant that people despise working with them. For example, if you look at the Microsoft Architect program, you will notice that there is a set of competencies that go well beyond technical skills. These competencies were based on focus groups from companies large and small. One common theme from these focus groups was the huge importance of soft skills. In fact, they identified more soft competencies than technical competencies. In their view, the soft skills are what separate the highly skilled technologist from the true software architect.

The International Association of Software Architects (IASA) has also gone through a detailed analysis and polled its members to determine the skills necessary to be a successful software architect, the importance of soft skills was highlighted.

In Apiumhub, we also believe that the most successful architects we know are able to increase their effectiveness by combining their technical and nontechnical skills. And the successful technical solution requires three distinct soft skills: business alignment, perspective awareness, and communication.

Most software projects begin with some type of requirements document that drives most of the technical decisions, or at least an architecture document that demonstrates how the architecture meets business needs. The issue is generally alignment at the strategic level. Usually, the software architect can discuss the business requirements, but it is surprising how often the architect cannot explain the project in terms that the CFO would understand. There is often a lack of understanding of the real business drivers and the detailed financial implications, as opposed to the business requirements – yet this is the critical factor that drives the real project decisions. Being a software architect implies thinking about your projects like a CEO and CFO. Invest the time up front to dissect the business drivers for the project, and if possible, determine the true financial impact of the costs and benefits of the project.

You need to think as an architect and not always accept clients’ demands as sometimes it is just impossible to do what you are asked to achieve. Use business drivers instead of requirements as your guide for developing the solution architecture. You need to keep an eye on business throughout the project lifecycle to maintain the appropriate flexibility in the project.

You should also constantly evaluate how your methodology maintains business alignment during the project life cycle. In other words, a software architect should think about scalability, performance and cost reduction.

So, let’s look at the most in-demand soft skills for software architects.


Leadership

You need to be an example for your team, a person they would like to become. Leadership is also about defining and communicating a vision and ideas that inspire others to follow with commitment and dedication. To lead, you need to provide direction, know where you are going, and make the decisions that will get you there. Understanding people is key here, as you need to know how to explain your decisions.


Communication

In our opinion, communication is the most important soft skill, whether oral or written! It means being able to actively listen to others and to explain your ideas, in writing and verbally, to an audience in a way that achieves the goals you intended for that communication. Communication skills are critical both for internal teams and for dealing with clients. Communication is also an important aspect of leadership, since leaders must be able to delegate clearly and comprehensively.

System thinking

Understand decisions and constraints in the wider scope. System thinking involves the techniques and thought processes essential to setting and achieving the business’s short-term and long-term priorities and goals. Your decisions should be aligned with the overall business of the company.


Flexibility

It is about adaptability, willingness to change, lifelong learning, and accepting new things. Really, don’t underestimate the ability to adapt to change. In today’s rapidly evolving business environment, the ability to pick up new technologies and adjust to changing business surroundings is critically important. Flexibility is an important soft skill, as it demonstrates an ability and willingness to acquire new hard skills and an open-mindedness to new tasks and new challenges.

Interpersonal skills

It is all about cooperation, about getting along with others, being supportive, helpful, and collaborative. You should be effective at building trust, finding common ground, showing emotional empathy, and ultimately building good relationships with people at work and in your network. People want to work with people they like, or think they’ll like: people who are easygoing, optimistic, and even fun to be around regardless of the situation. At the end of the day, if you can’t connect with someone, then you will never be able to sell your idea, no matter how big or small it may be.

Positive attitude

A positive attitude means being optimistic, enthusiastic, encouraging, happy, and confident.

This soft skill can be improved by offering suggestions instead of mere criticism, being more aware of opportunities, and complaining less. Experience shows that those who have a positive attitude usually have colleagues who are more willing to follow them. People will forget what you did, but people will never forget how you made them feel.


Responsibility

You should be accountable, reliable, and self-disciplined; you should get the job done and want to do it well. Don’t forget that you will be an example for others.

Knowledge sharing

Working in a team means helping each other and sharing knowledge. Companies don’t want a brilliant software architect who is never ready to share his or her knowledge with others. By sharing knowledge, you grow your team of high-quality tech experts.

Critical Thinking

The ability to use reasoning, past experience, research, and available resources to fundamentally understand and then resolve issues. For example, Bill Gates reads 50 books each year, most of them nonfiction and selected to help him learn more about the world. Critical thinking involves assessing facts before reaching a conclusion. Software architects are sometimes faced with a handful of possible solutions, and only critical thinking will allow them to quickly test each scenario mentally before choosing the most efficient one.


Organizational skills

Planning and effectively implementing projects and general work tasks, for yourself and others, is a highly effective soft skill to have.


Initiative

Employers are looking for employees who take initiative and are reliable. Sometimes, CEOs don’t have the time to think about tech issues, so the software architect should take the initiative and cover the technical side of the business.

Problem solving

Employers want professionals who know how and when to solve issues on their own, and when to ask for help. Problem-solving does not just require analytical, creative and critical skills, but a particular mindset: those who can approach a problem with a cool and level head will often reach a solution more efficiently than those who cannot. This is a soft skill which can often rely on strong teamwork too. Problems need not always be solved alone. The ability to know who can help you reach a solution, and how they can do it, can be a great advantage. It is also about being able to coordinate and solicit opinions and feedback from a group with diverse perspectives to reach a common, best solution.

Time management

Time management is more than just working hard. It means making the most of each day and getting the most important things done first: prioritizing. The ability to delegate assignments to others when needed is also part of it. Many jobs come with demanding deadlines and occasionally high stakes. Recruiters and clients prize candidates who show a decisive attitude, an unfaltering ability to think clearly, and a capacity to compartmentalize and set stress aside.

Never stop learning

Learning is a never-ending process. There is always someone you can learn from and some abilities you can improve or adjust. What matters is your willingness to learn.

There is a very good article about soft skills written by WikiJob, which inspired us to write this article about soft skills specifically for software architects. In conclusion, I want to say that every software architect should understand the importance of soft skills. A software architect should find a balance between hard skills and soft skills to be truly good at what they are doing and how they are doing it.



Total Quality and the Meaning of Data Integrity

Due to their direct impact on the health and even lives of patients, quality control and quality assurance are of paramount importance in the Life Sciences industry.


Due to their direct impact on the health and even lives of patients, quality control and quality assurance are of paramount importance in the Life Sciences industry. All stakeholders (government agencies, manufacturers, distributors and healthcare professionals) therefore take the issue of quality very seriously. We asked five questions to Alban van Landeghem, Sr. Life Sciences Business Consultant at Dassault Systèmes, about Total Quality, a concept of high priority for Life Sciences solutions at Dassault Systèmes.

What is Total Quality?

“Total Quality[1] is a systemic view over the quality of the product and all processes related to the entire development and production process,” says Van Landeghem. “In the Life Sciences industry, everyone considers the end-user of a drug or device: the patient. In order to provide the patient with the best possible supportive care, each step in the therapeutic solution lifecycle, from development, manufacturing and distribution to the management of bulk, intermediate or even final materials, needs to be considered.” Total Quality is the integrated framework that inserts controls and best practices at each step of the product lifecycle.

This total quality objective has one goal: to provide the best efficacy and safety in the manufacturing of medical solutions, as described in the International Council for Harmonisation[2] standards.

While Total Quality concepts were first developed in the automotive industry[3] for understandable reasons, pharmaceutical quality is of equal, if not greater, importance. “Users choose to drive a car knowing the risks; patients suffer their illness without being able to choose their drugs.” Therefore, a systemic quality approach guarantees a level of confidence in the appropriate efficacy and safety accepted by the market authorisation.

What does Total Quality consist of?

Total Quality consists of the control of all elements, processes and behaviours that lead to the manufacturing of the product as designed and as registered. “For years, the Life Sciences industry has been very committed to delivering best-in-class products to patients. To do so, standards and guidelines are continuously improved. For example, drug manufacturing guidelines such as Good Manufacturing Practices[4][5] (GMPs) advise companies on how to manage their quality systems, but also how to maintain their human resources knowledge and know-how.”

Manufacturers need to consider the importance of having the right people in the correct role, in the correct location, with the required knowledge and training. GMPs also address building and facility configurations: control of energy used in production, sanitation, and calibration and validation of equipment.

On top of this come the required records and documentation. Any process needs to be described in standard operating procedures (SOPs) and Work Instructions (WIs) in order to convey only best-in-class practices. Any product (bulk, intermediary, final) being released needs to be checked through quality control processes. Whereas quality control was typically performed at the end of operations, newer Quality by Design approaches can involve Process Analytical Technology tools, allowing control to take place earlier in the process.

Once established, these quality control activities need to be audited and tracked to ensure products meet the specifications established in the marketing authorisation file.

The reason “Total Quality” is so comprehensive in pharmaceutical development is that drugs should be manufactured as designed and as registered. “And it is important to note that Total Quality extends to the whole lifecycle of the drug.”

What is needed for Total Quality?

“To streamline Total Quality, Life Sciences companies need, above all, a complete and 360° vision of their enterprise operations. Dassault Systèmes delivers management of quality processes through the 3DEXPERIENCE platform, which helps users answer the specific needs of several industries – Life Sciences included”.

“The 3DEXPERIENCE Platform provides roles for collaboration, design, simulation and real-world analysis that will help, when properly deployed and installed, the final decision-maker to envision, based on digital experiences, the best decisions to take. The 3DEXPERIENCE Compass will indeed guide the decision-maker in its quality journey.”

What is the role of Data Integrity in Total Quality?

Data integrity[6] is not a new concept, especially not in Life Sciences. It was as important for paper-based records as it is for electronic records. Data integrity refers to the completeness, consistency, and accuracy of data.

It is a minimal requirement to guarantee trust among regulatory bodies, the Life Sciences industry ecosystem and, not least, patient communities.

These data integrity requirements are applicable to each step of the development of a drug or therapeutic device, from research, lab development and experimentation, manufacturing, distribution, and finally, administration.

Data integrity processes are a prerequisite to assure the end-user that drugs or devices are manufactured by compliant processes and without corruption of data, delivering the exact benefits expected and approved by regulatory bodies.

Data Integrity plays a very important role in gaining the trust of regulatory bodies. “For one, involved IT solutions need to be able to guarantee there has not been any information breach and that all data has been inputted by authorised staff.” Data also needs to follow ALCOA guidelines: attributable, legible, contemporaneous, original and accurate. “So you need to know who recorded the data, it needs to be readable, recorded at the time the work is performed, and in the right protocol from the primary data source, and of course complete and free from error,” Van Landeghem summarises.
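For illustration only, the ALCOA properties Van Landeghem lists can be sketched as a small record structure with a completeness check. The `QualityRecord` type, its field names and the check below are hypothetical, not part of any Dassault Systèmes product or regulatory standard:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)  # frozen: records cannot be silently altered after capture
class QualityRecord:
    """Hypothetical ALCOA-style data-integrity record (illustrative only)."""
    operator_id: str   # Attributable: who recorded the data
    value: float       # Accurate: the measurement itself
    unit: str          # Legible: stored in a readable, unambiguous form
    source: str        # Original: the primary data source, e.g. an instrument ID
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )                  # Contemporaneous: captured at the time the work is performed

def is_alcoa_complete(rec: QualityRecord) -> bool:
    """Minimal completeness check: every identifying ALCOA field is populated."""
    return bool(rec.operator_id and rec.unit and rec.source)

# A record as an authorised operator might capture it
rec = QualityRecord(operator_id="QC-042", value=7.2, unit="pH", source="probe-17")
```

A real system would add far more (access control, audit trails, tamper-evident storage); the sketch only shows how each ALCOA attribute maps to a concrete field.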

How do you guarantee Data Integrity?

Industry standards like ALCOA guidelines have consequences for the use of technology in quality control. “The platform needs to be controlled, verified, validated and secure,” Van Landeghem says. “That means strong password protection. It should also take into account the stage of development provided by R&D departments.”

By Dassault Systems Blogs3ds

Source: Alban Van Landeghem

A national strategy on the sharing of NHS data with industry

The UK Government has announced a new framework on how the NHS shares data with researchers and innovators, and a new National Centre of Expertise to provide specialist advice and guidance to the NHS on agreements for use of data.


The Department of Health and Social Care’s guidance published on Monday provides a framework that aims to help the NHS realise benefits for patients and the public where the NHS shares data with researchers.

NHSX to host Centre of Expertise

The Centre will:

  • Provide commercial and legal expertise to NHS organisations – for potential agreements involving one or many NHS organisations, such as cross-trust data agreements or those involving national datasets.
  • Provide good practice guidance and examples of standard contracts and methods for assessing the “value of different partnership models to the NHS”.
  • Signpost NHS organisations to relevant expert sources of guidance and support on matters of ethics and public engagement, both within the NHS and beyond.
  • Build relationships and credibility with the research community, regulators, NHS and patient organisations, including developing “insight into demand” for different datasets and “opportunities for agreements that support data-driven research and innovation”.
  • Develop benchmarks for NHS organisations on “what ‘good’ looks like in agreements involving their data”, and set standards on transparency and reporting.

The full policy framework document underpinning the Centre’s functions will be published later this year, with plans to recruit for the Head of the Centre over the coming months to enable it to commence work this year.

Five principles to support the use of data-driven innovations in the NHS

  • Any use of NHS data must have an “explicit aim” to improve the health, welfare and/or care of patients in the NHS, or the operation of the NHS, such as the discovery of new treatments, diagnostics and other scientific breakthroughs. And where possible, the terms of any arrangements should include quantifiable and explicit benefits for patients which will be realised as part of the arrangement.
  • NHS organisations entering into arrangements involving their data, individually or as a consortium, should ensure they “agree fair terms for their organisation and for the NHS as a whole”. Boards of NHS organisations “should consider themselves ultimately responsible for ensuring that any arrangements entered into by their organisation are fair”.
  • NHS organisations “should not enter into exclusive arrangements for raw data held by the NHS, nor include conditions limiting any benefits from being applied at a national level, nor undermine the wider NHS digital architecture, including the free flow of data within health and care, open standards and interoperability”.
  • Any arrangements agreed by NHS organisations should be transparent to support public trust and confidence in the NHS.
  • Any arrangements agreed by NHS organisations should fully adhere to all applicable national-level legal, regulatory, privacy and security obligations, including the National Data Guardian’s Data Security Standards, the General Data Protection Regulation and the Common Law Duty of Confidentiality.

Data agreements going forward

This latest iteration of the principles should be factored into decisions taken by the NHS and partners when entering into data agreements.

But NHS organisations are reminded that agreements should not be entered into which “grant one organisation sole (exclusive) right of access to or use of raw NHS data, either patient or operational data”.

The principles are intended to cover two types of agreements, those:

  • involving data entered into by all NHS organisations, at the primary (GPs), secondary and tertiary care levels, including relevant data from organisations contracted and funded to deliver NHS services; and
  • involving a commercial partner or where the outputs could be commercialised, regardless of the type of organisation the NHS is partnering with.

The department plans to publish another iteration of the principles in a new policy framework later this year.

By Stuart Knowles

Source: Mills & Reeve

A new dawn in engineering and innovation as UCLan’s new centre is launched

A new dawn in engineering and innovation has been unveiled today as UCLan’s Engineering Innovation Centre is officially launched.


Looming large on the Preston skyline, the £35m teaching and research facility engages directly with industry and provides students with real-world experience on live, engineering-related projects.

The aim of the Engineering Innovation Centre (EIC) is to provide courses which respond to industry demand to improve productivity across the North West.

The strategy aims to support the innovation needs of 1,300 regional small and medium enterprises now and in the future.

The EIC will act as one of the driving forces behind the Lancashire Industrial Strategy as well as national industrial strategy, addressing the need for innovation and producing the next generation of world-class engineers.

Research and teaching facilities include a 3D printing lab, an advanced manufacturing workshop, an intelligent systems facility, a motorsports and air vehicles lab, a high-performance computing lab, a flight simulator suite as well as a fire, oil and gas facility.

Read more: Preston’s university pledges to put city at its heart

To date, the EIC is the largest single investment in Lancashire’s educational infrastructure, establishing UCLan as one of the UK’s leading universities for engineering innovation.

Identified as a signature project within Lancashire’s Strategic Economic Plan, the EIC secured £10.5 million worth of funding via the Lancashire Enterprise Partnership’s Growth Deal with the Government.

Read more: Turkish BAE interns given tour of Preston

Professor Graham Baldwin, vice-chancellor at UCLan, said: “The provision of practice-based learning has always been a strength of this University and now, through the EIC and our links with industry, we will ensure our students gain exposure to even greater levels of applied, real-world learning.

“Our strategy is to ensure the University is at the forefront of future skills development enabling Lancashire and the North West region to lead the new ‘digital’ industrial revolution which is now upon us.”

The new facility has also received £5.8 million from the European Regional Development Fund (ERDF) and £5 million from HEFCE’s STEM Capital Fund.

The EIC forms part of the University’s £200 million Masterplan, which also includes a new student support centre, improvements to the public realm and highways around the Adelphi roundabout, new social spaces and a new multi-faith centre, all at the Preston campus.

Working in partnership, SimpsonHaugh and Reiach and Hall Architects designed the EIC, which was built by main contractor BAM Construction.

David Taylor, pro-chancellor and chair of the University Board, added: “The EIC is not only a significant asset to the University but also the county, wider region and the UK.

“It will act as one of the driving forces behind the industrial strategy both on a regional and national scale while cementing Lancashire’s position as a national centre of excellence for aerospace, advanced engineering and manufacturing.”

Minister for the Northern Powerhouse and Local Growth, Rt Hon Jake Berry MP, said: “We are committed to boosting economic growth across the Northern Powerhouse and levelling up every place in the UK as we prepare to leave the EU on 31 October.

“Thanks to £10.5 million of investment from the Government’s Local Growth Fund, the University of Central Lancashire’s flagship Engineering Innovation Centre will play an important role in cementing the North’s long-standing reputation for world-class further education, scientific innovation and engineering excellence.

“The advances made and skills learned at this pioneering facility will have far-reaching benefits from equipping young people for well paid, highly skilled jobs to technological advances supporting manufacturing businesses throughout the North and around the world.”

Steve Fogg, Chair of the Lancashire Enterprise Partnership, added: “The LEP has invested £10.5m Growth Deal funding towards creating this world-class centre of excellence for high technology manufacturing which will support innovation in local businesses and supply the skilled and talented engineers they need to grow and succeed.

“Lancashire is already the country’s number one region for aerospace production and advanced manufacturing.

“By funding projects like the EIC, the development of the Advanced Manufacturing Research Centre at the Samlesbury Aerospace Enterprise Zone and new education and training facilities across the county, the LEP is investing in the facilities and the skilled workforce of the future needed for the sector to maintain and build on its leading position, compete on the global stage and take advantage of opportunities in emerging markets.

“Our £320m investment programme is supporting strategically important projects like this all across Lancashire which, together, will drive substantial economic growth for years to come, create thousands of new jobs and homes and attract £1.2bn in private investment.”

By Rachel Smith

Source: Preston City Centre, UCLan Blog Preston


No-deal Brexit planning for life sciences businesses – new guidance and scenario planning

Brexit uncertainty remains a fact of life for business. Next week’s Parliamentary vote on the Withdrawal Agreement is unlikely to resolve matters. For the time being, planning for no-deal is set to stay with us as a time-consuming and costly distraction from other priorities.


Clearly the regulatory issues for life sciences businesses are substantial, with most regulation based on EU law and much of it implemented through EU institutions. New guidance from Government addresses a number of areas and helps to put some of the issues in context.

The medicines regulator, the MHRA, brings together relevant Government guidance and communications with industry on its website, Making a success of Brexit. The latest addition to this collection is a Further guidance note, produced in response to the consultation on draft legislation and giving more detail on the arrangements in the event of no deal. We highlight below a few points from that guidance:

Medicines Regulation

The MHRA will take on regulation for the UK market. The guidance puts forward a package of measures largely replicating the European system and offering some attractive features to maintain the UK’s competitiveness as a research and development location. The proposals include:

  • Grandfathering of Centrally Authorised Products:[1] Transitional legislation will ensure that Centrally Authorised Products will benefit from an automatic UK marketing authorisation for a limited period. Marketing authorisation holders can opt-out of this “grandfathering” process. If they wish to retain the UK MA, MAHs will have to provide baseline data for grandfathered UK MAs by 29 March 2020. Processing of variations will require at least basic baseline data to have been submitted.
  • MA assessment routes: new assessment procedures for products containing new active substances and biosimilars are planned. These will include a 67-day review for products benefiting from a positive EU CHMP opinion, a full accelerated assessment for new active substances taking no more than 150 days and a “rolling review” process for new active substances and biosimilars still in development.
  • Abridged applications would need to reference UK authorised products. However, this would include Centrally Authorised Products that had been converted to UK MAs and also unconverted Centrally Authorised Products granted before Brexit.
  • Incentives for orphan medicines will be offered, including fee refunds and waivers, and a 10-year exclusivity period. The EU’s pre-marketing orphan designation will not be replicated, as a separate UK designation is not seen as providing a substantial additional incentive for developers.
  • Data exclusivity and incentives for paediatric investigation plans will largely replicate the current EU legislation, at least initially.
  • New UK-specific legal presence requirements will be introduced for holders of marketing authorisations and Qualified Persons (QPs).[2]
  • Arrangements for recognition of QP certification from EU countries are planned. Wholesalers will need to familiarise themselves with the details of this system as there are specific requirements designed to ensure public safety.
  • Some elements of the Falsified Medicines regime will fall away, as the UK is unlikely to have access to the central EU data hub recording dealings with individual packs of medicines. The future of this regime in the UK will be evaluated.
  • The UK plans to permit ongoing parallel importation of medicines authorised elsewhere in the EU where the MHRA can satisfy itself that the imports are essentially similar to a UK-authorised product. Parallel import licence holders will have to comply with new requirements such as establishing a UK base.

Medical devices

Plans for the future regulation of medical devices are less well developed. Further consultation will be carried out before changes are made. Importantly, the UK intends to track the implementation of the new EU laws on medical devices and in vitro diagnostic medical devices, due to apply from May 2020 and May 2022 respectively.

The guidance recognises that UK Notified Bodies will lose their status under EU legislation in the event of no-deal. Products they have certified will no longer be validly marketed.

The UK will take steps to minimise short term disruption by continuing to allow marketing of devices in conformity with the EU legislation and also those certified by UK Notified Bodies. It will continue to recognise existing clinical investigation approvals and will not require labelling changes. New medical devices will need to be registered with the MHRA, although grace periods of up to 12 months after Brexit day are provided to give manufacturers time to comply.

Clinical trials

Much of the clinical trials system operates nationally and can continue, and the UK will continue to recognise all existing approvals. For new trials, a sponsor or legal representative could be based in the UK or in a country on an approved list, initially including all EU and EEA countries.

The UK would no longer have access to the European regulatory network for clinical trials, and pan-EU trials will presumably require an EU-based sponsor or legal representative.

The UK intends to align with the new EU Clinical Trials Regulation to the extent that it can. This is unlikely to include access to the EU clinical trials portal, but a new UK clinical trials hub will be introduced to provide a similar central information resource for UK trials.

Scenario planning for your business

These regulatory changes form just part of the picture for life science businesses, many of whom must also contend with a range of other issues. A number of our clients are already planning for different scenarios. Our experience indicates that the issues needing consideration fall into the following categories:

  • Regulatory
  • IP
  • Contracts
  • People
  • Establishments & Structures

[1] Centrally Authorised Products are those which have been through the European Medicines Agency’s approval process resulting in a single approval for the whole EU.

[2] A Qualified Person is an experienced professional responsible for certifying that medicines comply with applicable legal requirements. 

By Isabel Teare, Senior Legal Adviser

Source: Mills & Reeve

How Scottish universities are leading medical device manufacturing

As health-tracking apps become more common and medical devices get ever-smaller, data is powering the next generation of breakthroughs.


Laser fabricated fibre-optic probe for biomedical tissue optical biopsy applications (Credit: C. Ross)


The Medical Devices Manufacturing Centre (MDMC) is now bringing together a wide range of Scottish expertise to support innovative new ideas in this area.

“It’s about using university expertise to help companies developing these devices to get from concept to commercialisation – from the idea to the market-place,” says Professor Duncan Hand, of Heriot-Watt’s School of Engineering and Physical Sciences, where the MDMC – a collaboration involving Heriot-Watt, and Edinburgh, Glasgow, and Robert Gordon universities – is based.

“We want to create practical and commercially viable devices; they might be sensors linked to mobile phone apps that deliver vital health information, right through to tiny devices that can be used by surgeons inside the body.”

Data is vital to the process in two ways – in manufacturing devices and providing vital insights.

“Manufacturing is very data-driven, because the whole process needs to be controlled and extremely reliable,” explains Prof Hand. “If you want a high-quality manufacturing process, you need to acquire data to monitor both the overall process and specific parts of it, and relate that back – so we can tell when things might be going wrong, to modify the process and make it better. This is particularly crucial when manufacturing low to medium volumes and bespoke (personalised) items to ensure they are ‘right first time’.

“Data associated with that process control is very important.”
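As a sketch of the kind of data-driven process monitoring Prof Hand describes, a simple Shewhart-style control check flags new readings that fall outside three standard deviations of baseline process data. The function name, numbers and threshold below are illustrative assumptions, not taken from the MDMC:

```python
from statistics import mean, stdev

def out_of_control(baseline, new_readings, n_sigma=3.0):
    """Return readings outside n_sigma sample standard deviations of the
    baseline mean (a minimal Shewhart-style control-chart check)."""
    mu = mean(baseline)
    sigma = stdev(baseline)  # sample standard deviation of the baseline run
    lower, upper = mu - n_sigma * sigma, mu + n_sigma * sigma
    return [x for x in new_readings if not (lower <= x <= upper)]

# Baseline measurements from a stable process, then three new readings;
# only the drifting reading is flagged for investigation.
baseline = [10.0, 10.1, 9.9, 10.2, 9.8, 10.0, 10.1, 9.9]
flagged = out_of_control(baseline, [10.05, 9.95, 12.5])  # → [12.5]
```

Real process monitoring adds run rules, drift detection and per-parameter limits, but the idea is the same: relate live measurements back to characterised process data so problems are caught early.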

Data is also vital in deriving insights from health monitoring sensors, says Hand: “Sensors might monitor heart rate or oxygen levels, for example, and they are linked to a mobile phone app. Taking data outputs from various sensors and analysing this over a long period of time is a crucial indicator of change and can deliver real health insights.”

The MDMC can also benefit from Heriot-Watt’s expertise in laser technology.

“The laser is an exemplar, highly flexible manufacturing tool increasingly used in making very small and high-precision objects,” says Hand. “This is where we will see crossover with medical devices – and again, data is really important in getting things right in manufacturing very small objects. In many ways, lasers are the ideal data-driven tool.”

There are also close links to Heriot-Watt’s world-leading work at the Edinburgh Centre of Robotics, the foundation of the new National Robotarium, due to open on the Heriot-Watt campus in 2022.

“When we look at medical devices, one key area is how we can manufacture and deploy miniature devices in the body to help surgeons to perform high-precision operations; the link with robotics is clearly important in developing such devices,” says Hand.


The National Robotarium and the MDMC – and all of Heriot-Watt’s work in these and related areas – have industry partnership at the heart of their mission. “We aim to translate research to support innovation and help to bring new products to market,” says Hand. “At the MDMC, this means supporting better treatments and better health outcomes.

“The Centre is an extension of what we’ve been doing very well over many years. We want to build on existing relationships, bring in new companies and develop new partnerships.”

One example is spin-out company IntelliPalp Dx, which aims to revolutionise prostate cancer screening by developing a system to provide more accurate early-stage testing, leading to reduced patient anxiety and more efficient diagnosis.

“The idea is to discern the relationship between changes in the structure and stiffness of the prostate and link this to the likelihood of cancer being present,” says Heriot-Watt Professor of Materials Engineering, Bob Reuben.

“Essentially, we are aiming to emulate and enhance the human sense of touch [palpation] through a mechanical device to bring greater precision and objectivity.”

Initial results have been successful and it is hoped the same principles could be applied to other forms of human tissue assessment.

Prof Hand says laser technology can help in areas like this by manufacturing ‘miniaturised’ devices: “It can readily reduce the current design down to 2-3 millimetres to allow use in other parts of the body, via the working channel of an endoscope. Data is also crucial in projects like this, to analyse results from the devices and make judgements on treatment.”

The University’s work also includes developing ultra-fast laser tools for surgeons to remove cancerous tissue, says Hand: “This includes neurosurgical work, to remove the edges of brain tumours, for example, and requires really high-precision instruments. That’s one area where robotics, lasers and surgical tools can work together closely – and again, where data is needed to deliver the highest possible precision in the instruments.”

The Medical Device Manufacturing Centre is jointly funded for three years (initially) by the European Regional Development Fund managed by Scottish Enterprise, Edinburgh & South East Scotland City Region Deal, and the four universities.

Source:   The Scotsman

Why the EU should put innovation at the centre of its recovery plan

Europe is facing an unprecedented crisis and as a consequence a threat to its unity. The health, social and economic consequences of the Covid-19 outbreak have been felt very differently across the Union.


This is exacerbating existing disparities between the member states. The EU’s recovery strategy should therefore not just repair the damage done, but also actively prevent further divergence in the Union.

In July, European leaders agreed on a recovery package, which European Council President Charles Michel presented to the world with the words “Europe is strong, Europe is robust, and above all, Europe is united”. But does the deal indicate that Europe is strong, robust and united? We believe that our leaders have missed a crucial disparity which needs to be addressed to ensure long term positive convergence within the Union: the innovation divide, namely the disparity in the ability of member states to generate new ideas and to translate them into economic growth and prosperity. Instead of investing in innovation to address this divide, innovation spending has been cut by EU leaders.

This was a surprising outcome because innovation has been put forward by several member states and the Commission as a crucial part of the recovery in Europe. And rightly so. Innovation is the only way through which we will eventually defeat Covid-19 by developing a vaccine and effective treatments. Innovation is also crucial in addressing the twin-transitions Europe was already facing before the pandemic: the digital revolution and becoming a sustainable continent.

However, most crucially, innovation is regarded as a way to reinvigorate economic growth. One could even argue that for Europe, innovation is the key driver for sustainable economic growth as it has been found to fuel two-thirds of economic growth in Europe and up to 30% of productivity growth in the member states.

However, Europe also has an innovation deficit when compared to its global competitors and the innovation divide is one of the causes. The recovery from the Covid-19 induced crisis should address this deficit because it is a crucial factor in slowing Europe’s sustainable growth. Unfortunately, our leaders did not recognise this crucial role of innovation in their deal.

In theory, economic growth can be generated by increasing labour force participation (‘more people in work’) or by increasing labour productivity (‘creating more value per employee’). Before Covid-19, unemployment in Europe was at historic low levels. These levels have risen (and are expected to rise) due to the outbreak, but after they rebound, increased labour force participation will not drive economic growth in Europe sustainably.

This means we need to increase labour productivity for sustainable growth. Increasing labour productivity can be achieved first and foremost by increasing the capital intensity in an economy, in other words more and better machines. However, following the law of diminishing returns, we can expect that for the already highly industrialised European economy, the added value of those investments will be low and will become lower over time.

Therefore, Europe will need to rely on other factors to foster labour productivity growth – and innovation has the capacity to carry a large share of that burden. From 2010 to 2016 it was responsible for about 60% of labour productivity growth. The ability to innovate – to turn knowledge into new or increased economic activity – is thus crucial for recovery and the return of sustainable growth in Europe. This makes the innovation divide a central issue for our common recovery.

How to tackle the divide

The causes of the innovation divide are diverse and could be either addressed or deepened by how we shape Europe’s recovery. For example, having attractive research systems, securing sufficient public and private investments, and an innovation-friendly policy environment are among the aspects where the innovation divide is most pronounced in Europe (see Figure 1) and our leaders could have ensured that the recovery package addressed these issues.

Figure 1: Performance groups: innovation performance per dimension

Credit: European Union

For example, the Recovery and Resilience Facility will supply cheap money for public investments through national recovery plans in line with the European Semester. This could be a perfect means to shore up national research systems and encourage member states to make their policy environment more innovation friendly. This would require a direct instruction to the member states to address these issues in their national recovery plans.

Unfortunately, the EU’s leaders did not include anything along those lines in their deal. To ensure that the facility is working towards closing the innovation divide, the European Parliament as co-legislator could specify in article 16.3 of the Regulation on the Facility that the national recovery plans have to contribute to making national economies more innovative.

Another way to address the innovation divide is increasing pan-European innovation collaboration. Research suggests that collaboration with partners from knowledge-intensive regions can be a key driver for improving the innovation capacity of a region. It is precisely this collaboration that should be fostered in the recovery over the coming years, both by EU action and by national initiatives.

National governments could invest more in fostering science and innovation collaboration with other member states as part of their international science and innovation policies. Here the initiatives of the Czech government since its adoption of a new innovation strategy might serve as an example. The European Parliament has no direct role in this, but MEPs can use their personal positions to encourage it. What the Parliament could also do is advocate that spending under national recovery plans for the Recovery and Resilience Facility is partly invested in cross-border innovation collaboration. At the EU level, the Commission proposed additional investment in the collaborative pillar of Horizon Europe as part of the recovery plan to enable this. However, EU leaders decided to cut this budget. Horizon Europe was cut by €13.5bn of which about €9bn was expected to go to collaborative research projects.

The European Parliament has already indicated that it cannot accept these cuts, but in the past the Parliament has mostly put its weight behind additional funding for the ‘Widening Participation’ programme to address the divide. However, this programme is aimed at increasing participation in the Framework Programme by researchers from ‘widening’ countries, rather than closing the innovation divide. The innovation divide and participation in the Framework Programme do not overlap fully and so far the widening efforts have not been proven to contribute to any closing of the innovation divide. Therefore, it would be more productive if the Parliament were to focus on restoring collaborative research budgets.

Ever since the Lisbon Strategy of 2000, we have known that Europe’s future prosperity and competitiveness is tied to its ability to innovate. The recovery from the crisis induced by Covid-19 represents a perfect opportunity for Europe to tackle one of the key issues to strengthen that ability: the innovation divide. Unfortunately, the deal between EU leaders on the next budget and the recovery package failed to take this opportunity. Our hope now rests on the wisdom and political power of the European Parliament. With the right priorities, the Parliament could truly be the guardian of our shared European future.

Source:   The London School of Economics

Skin Analytics raises £4M Series A to use AI for skin cancer screening


Skin Analytics, a U.K.-based startup that has developed a skin cancer screening service that uses artificial intelligence, has raised £4 million in Series A funding. The round was led by Hoxton Ventures, with participation from Nesta and Mustard Seed Ventures.

Skin Analytics says it will use the injection of cash to expand its focus to the U.S. after it was awarded the “Breakthrough Device Designation” by the FDA as part of a programme designed to fast-track new technologies that can have significant impact on the nation’s health.

It also will continue forging partnerships within the U.K.’s national health service, following the launch of what it claims was the world’s first “AI-powered” clinical pathway in conjunction with University Hospital Birmingham.

Skin Analytics offers a CE marked medical device that studies suggest is able to identify skin cancers, pre-cancerous and benign lesions “to the same level as a dermatologist.” The idea is to enable health systems and insurers to increase dermatology capacity by reducing the burden of diagnosis for dermatologists.

“At its most simple, skin cancer is the world’s most common cancer and incidence is increasing around the world,” says Skin Analytics founder Neil Daly. “Overlay that with the fact there is a global shortage of dermatologists and we have a real challenge already with how we identify and deal with skin cancer.”

Daly says that Skin Analytics has developed a clinically validated AI system that can identify not only the important skin cancers, but the pre-cancerous lesions which can be treated by GPs and a range of benign lesions. “We can do that using a low-cost attachment and a smartphone, allowing us to put this into innovative patient pathways either at GP practices or in hospitals,” he says.

“By using our service, we can reduce the number of patients who end up in hospital by 40-60%, depending on where our technology is used. That [means]… we can reduce the demand on our scarce dermatology resources, freeing them up to focus on other patients such as the inflammatory skin disease patients who often wait months for appointments. We can also reduce the cost of skin cancer, freeing up that money to be reassigned to improving care elsewhere.”

Because skin cancer is such a large problem, and given the advances in AI, Daly notes that many companies initially worked in this space, with an explosion of competitors in 2014 that saw around 50 firms in the field. “All but three or four are gone now as the reality of how complex the technology is and how challenging operating as a clinical company hits home,” he says, before adding that there appears to be another wave of competitors surfacing.

“In reality, we’ve spent so much time learning from our mistakes in developing our AI, this is one of our main points of difference,” cautions the Skin Analytics founder. “It is too easy to get started and think you’ve made a great algorithm, but when you test it in the real world — and you can only do this with a prospective clinical study — the performance just isn’t there. Not only have we done that, but we use our research strategy to ask questions that give us the data to continue to improve our algorithms. There is no shortcut for this, you need to test, improve and repeat.”

Another key differentiator is that you can’t ‘fake it until you make it’ in a highly regulated industry and the processes that come with that. “You have to build them into the fabric of your company and it’s slow and painful,” adds Daly. “Medical device companies have to find a way to innovate quickly within a safety critical environment and I’m very proud of the way our team has built that ability, and continues to do so.”

Source: TechCrunch

Scientists create a drug that ‘repairs damage to the brain and spinal cord’ in a potential breakthrough for paralysed patients and Alzheimer’s sufferers


A drug created by British scientists could repair damage to the brain and spinal cord by improving messaging between cells.   

Scientists led by the MRC Laboratory of Molecular Biology in Cambridge created a synthetic version of a protein known as Cerebellin-1 that links brain messaging neurons together.

The compound, called CPTX, acts like a ‘bridge’ where connections have been lost due to damage or illness. 

CPTX was able to repair function in both laboratory-grown cells and in mice with neurological deficits that occur in a similar fashion in humans. 

Results of the compound in mice and cells grown in the lab were described as ‘striking’, improving movement coordination and memory. 

It offers hope of new therapies for a range of devastating conditions, from Alzheimer’s disease and epilepsy to paralysis from a car accident.

The greatest impact was seen in mice with spinal cord injuries, in which motor function returned for at least seven to eight weeks after just a single injection of CPTX.

Researchers managed to make their compound CPTX form bridges between two sides of a broken nerve connection, allowing electrical signals to pass through again and restoring movement in lab tests and in disabled mice (Pictured: A broken connection on the left, and one repaired with CPTX on the right)


The scientists found that mice had better control of their movement and there was more electrical activity in the brain and spine after they had been injected with the CPTX compound



CPTX combines various properties of different organising proteins.

These ‘synaptic organising proteins’ occur naturally in the human body and make sure synapses are formed and reconfigured whenever necessary. 

They are essential to establishing the communication network that underlies all nervous system functions, because they act as ‘junctions’ through which nerve signals pass from one cell to the next.

The researchers likened the production of CPTX to ‘cutting and pasting’ information from the internet. In effect, they took structural elements from different ‘organiser molecules’ to make a unique one.

Where two neurons meet, either in adhesive contact or actually in synaptic connection, CPTX links to specific molecules on the surfaces of both involved cells.

It triggers the formation of new synapses or strengthens already existing ones. 

It was also shown to increase the density of ‘dendritic spines’, tiny bulges in the cell’s membrane that are essential for establishing synaptic connections.

Cerebellin-1 and related proteins, known as ‘synaptic organising proteins’, make sure synapses are healthy.

They are essential for establishing the network that underlies all movements and functions in the body.

In the early stages of Alzheimer’s, a brain disease that affects 850,000 people in the UK and 5.7million in the US, and other neurodegenerative disorders, synapses deteriorate and are lost forever.

This eventually causes neurons to die, leading to the classic symptoms of confusion, trouble understanding things and memory loss.

The same happens with spinal cord damage, which could be the result of a car crash, for example. 

It interrupts the constant stream of electrical signals from the brain to the body and can lead to loss of movement, sensation, spasms, bladder and bowel control or paralysis.

An estimated 50,000 Britons and 290,000 Americans are living with a spinal cord injury. 

Researchers led by Dr Radu Aricescu, a neuroscientist at the MRC Laboratory of Molecular Biology in Cambridge, wanted to see if they could create an artificial version of Cerebellin-1. 

Working with colleagues in Germany and Japan, Dr Aricescu’s team worked to ‘cut and paste’ structural elements from different organiser molecules to generate new ones.

This led to CPTX, described today in the journal Science. 

Dr Aricescu said: ‘Damage in the brain or spinal cord often involves loss of neuronal connections in the first instance, which eventually leads to the death of neuronal cells.

‘Prior to neuronal death, there is a window of opportunity when this process could be reversed in principle.

‘We created a molecule that we believed would help repair or replace neuronal connections in a simple and efficient way.’

He added: ‘We were very much encouraged by how well it worked in cells and we started to look at mouse models of disease or injury where we see a loss of synapses and neuronal degeneration.’

The compound, called CPTX, acts like a ‘bridge’ where connections have been lost due to damage or illness. Pictured is an illustration of CPTX (in orange) and how it is like a bridge between nerve cells

Experiments found CPTX had a remarkable ability to organise neuronal connections in lab conditions.

The researchers then tested its effect in mice, finding ‘striking’ results of restored connections and improvements in memory, co-ordination and movement tests in all models. 

Mice genetically engineered to have poor muscle coordination, also known as cerebellar ataxia, were included in the study.

The condition can occur in many human diseases, all caused by damage or loss of the synapses, causing patients to suffer problems with balance, gait and eye movements.

Researchers watched the lab rodents’ neuronal tissue repair itself after the molecule was injected into their brains. It also boosted motor performance.

Encouraged by the success, they then tried the treatment on other mouse models of neuronal loss and degeneration – including Alzheimer’s disease and spinal cord injury.

They saw CPTX increased the ability of synapses to change, which is vital for storing memories. This ability is lost in Alzheimer’s, when the ability to remember past events or how to do daily tasks declines over time.

Co-author Professor Alexander Dityatev, of the German Center for Neurodegenerative Diseases, Bonn, who has been investigating synaptic proteins for years, said: ‘In our lab we studied the effect of CPTX on mice that exhibited certain symptoms of Alzheimer’s disease.

‘We found that application of CPTX improved the mice’s memory performance.’ 

But the greatest impact was seen in spinal cord injury, where motor function was restored for at least seven to eight weeks following a single injection into the site of injury.

In the brain, the positive impact of injections was observed for a shorter time, down to only about one week.

But the researchers are confident they can rectify this and are now developing new and more stable versions of CPTX so that it has long lasting effects.

Professor Dityatev said: ‘CPTX could be the prototype for a new class of drugs with clinical potential.

‘Much of the current therapeutic effort against neurodegeneration focuses on stopping disease progression and offers little prospect of restoring lost cognitive abilities.

‘Our approach could help to change this and possibly lead to treatments that actually regenerate neurological functions.’ 

A lot more work is needed to find out if the findings in mice are applicable in humans. But the team are excited by the potential implications for a host of disorders associated with reduced neuronal connectivity. 

Dr Aricescu said: ‘There are many unknowns as to how synaptic organisers work in the brain and spinal cord, so we were very pleased with the results we saw.

‘We demonstrate we can restore neural connections that send and receive messages, but the same principle could be used to remove connections.’

This could benefit patients with epilepsy, for instance, although the researchers did not say this themselves.

Epileptic seizures are bursts of electrical activity in the brain that temporarily affect how it works.  

Dr Aricescu added: ‘The work opens the way to many applications in neuronal repair and remodelling. It is only imagination that limits the potential for these tools.’ 

Professor Dityatev said the credit for the experimental drug ‘goes to our UK partners’.

But he added: ‘We are far off from application in humans.’

Source: Daily Mail

Why innovation needs to be for the many not the few

Innovation is critical to creating a competitive and productive economy…


Over the last few months we have seen many reports purporting to be responding to the challenges that will be faced by the UK following the Covid-19 pandemic.

In addressing this “new normal” some have, frankly, stretched credulity with their claims of revolution in society, the workplace and the economy as a whole.

However, when an organisation such as Nesta – one of the UK’s leading think tanks – brings together its thoughts on how innovation could and should change to create a more equitable and productive economy going forward, then it is worth an hour of anyone’s time (especially those working in politics and policy) to read through its conclusions.

Its report, Innovation after Lockdown — using innovation to build a more balanced, resilient economy —  certainly hits the mark with a number of observations and, more importantly, recommendations that need to be considered carefully at the highest levels of government.

The most important point is that whilst we all know that innovation is critical to creating a competitive and productive economy, this shouldn’t be about “business as usual” especially as increasing investment in research, development and innovation has not resulted in a more balanced economy across the UK’s regions.

Yet again, the point is made that whilst places such as London and the south east of England are amongst the most productive parts of Europe, others  — such as Wales — remain relatively poor and have even been overtaken by regions that were formerly part of communist East Germany.

To reduce these inequalities, Nesta argues that a number of changes need to be made to support economic recovery.

First of all, the policy of “picking” technology winners needs to be abandoned in favour of an approach that champions innovation in less ‘sexy’ sectors such as retail, hospitality and social care, all of which have been hit hard during the last five months.

Moving away from the view that innovation is only relevant to a small number of sectors, towards an approach that addresses the needs of a wider range of industries employing large numbers of people, could have a major impact on the overall productivity of the UK economy.

This could include increased investment not only for high technology firms as has been the norm for many years but also for low productivity firms and sectors to adopt basic technologies and management practices. In particular, there should be greater support for firms that want to start using digital tools to improve their business and for those that want to introduce automated processes in their business in a way that improves jobs rather than replacing them.

Secondly, and as discussed in this column many times over the last sixteen years, there needs to be a realisation that public expenditure on research and development can no longer be concentrated on London, Oxford and Cambridge (which get 52% of investment despite having only 37% of the population).

Instead, there needs to be a greater levelling up across the UK so that nations, regions and even individual city-regions can get access to the resources they need to focus on their own innovation priorities.

This could range from a complete devolution of innovation funding to the creation of regional offices of UK Research and Innovation – the main research body for the UK economy – to ensure better local investment.

The fact that much of the recent additional funding to be released by Innovate UK to support greater investment to deal with the challenges of Covid-19 will mainly go to those firms that have previously received funding means that regions such as Wales – where the business community has always received a proportionately lower level of grants than it should have – will not benefit to the same extent as more prosperous regions.

Finally, a range of studies have consistently shown that it is not only market forces that create the innovations that have a massive economic and societal impact.

Given this, the UK Government could and should adopt an infrastructure that maximises public benefit and minimises potential harms. This should include a commitment to mission-led innovation policy and more funding for innovation that directly tackles societal problems, including those that are a direct result of the recent pandemic.

We all know, therefore, that innovation has the potential to make a real difference to the economic performance of nations and regions. However, Nesta’s report points out some major issues with the current approach, which focuses on a narrow set of sectors and places.

Certainly, this cannot continue if the UK Government is committed to levelling up prosperity across the country and, more importantly, is serious about a more competitive post-Brexit economy as we move out of recession.

Source: Business Live

3D printing innovations deliver medical breakthroughs for Veterans

VA was an early adopter of 3D printing, using the technology for years to promote health care innovation and address individual Veteran health care needs.


The benefits of 3D printing are limitless—from individually customized care, such as creating hand and foot orthotics, prosthetic limbs, and reconstructive surgery, to more groundbreaking applications, such as the ability to accurately replicate a patient’s heart, lung, spine, or aortic valve. Clearly, it has had a profound impact on Veterans’ lives.

VA’s early investment in 3D printing technology has allowed the Department to innovate and improve Veteran care on an ongoing basis and permitted VA staff to quickly apply the technology to aid in VA’s COVID-19 response.

Supports VA clinicians and Veterans nationwide during COVID-19

3D printing has proven essential to medical staff treating patients on the front lines of the COVID-19 pandemic. In response to the pandemic, VA coordinated an MOU for open-source medical products with the Food and Drug Administration (FDA) and the National Institutes of Health (NIH). Additionally, VA partnered with America Makes to rally health care providers and 3D printing organizations into rapidly innovating face mask designs during the pandemic.

Two VA challenges—the Fit to Face Mask Design Challenge and the COVID-19 Maker Challenge—called on innovators and designers to address problems health care workers and first responders encountered while using face masks.

For Beth Ripley, MD, Ph.D., Director, VHA 3D Printing Network and Chair, VHA 3D Printing Advisory Committee, COVID-19 raised awareness that the strength of the medical centers’ response during the pandemic existed in the collaboration and innovation of the teams.

“We are non-siloed, integrated and collaborative,” she said. “Everyone jumped on board and worked across sites to print face shields and ensure they were distributed to the appropriate locations.”

Transforming a computer file into anatomical precision

The impact 3D printing can have on a Veteran’s life and well-being is significant. Imagine your physician discovered a huge tumor wrapped around your ribs and growing on your lung. Using a 3D-printed model derived from computerized tomography (CT) or magnetic resonance imaging (MRI) scans and sophisticated computer software, your medical team can hold the model of your anatomy in their hands. Using the model as a visual aid for pre-surgical planning, they can see details they may not see in two-dimensional imaging. Models also help clinicians determine whether you can avoid an invasive surgical procedure and painful recovery.
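The first step behind such patient-specific models is segmenting the scan: picking out the voxels that belong to the structure to be printed. The toy sketch below is purely illustrative and is not VA's actual pipeline; real workflows use DICOM readers and marching-cubes meshing, and the tiny "scan", threshold, and function names here are assumptions for demonstration.

```python
# Toy sketch of the segmentation step behind patient-specific 3D printing:
# threshold a (hypothetical) CT intensity volume to isolate dense tissue.

def segment(volume, threshold):
    """Return a binary mask: 1 where intensity >= threshold (e.g. bone)."""
    return [[[1 if v >= threshold else 0 for v in row]
             for row in plane]
            for plane in volume]

def voxel_count(mask):
    """Number of voxels selected - proportional to the printed model's volume."""
    return sum(v for plane in mask for row in plane for v in row)

# 2x2x2 toy "scan" with Hounsfield-like intensities (illustrative values)
scan = [[[100, 400], [250, 500]],
        [[80, 300], [450, 120]]]

mask = segment(scan, 300)  # keep only dense tissue
print(voxel_count(mask))   # → 4 voxels would become part of the model
```

In practice the binary mask is then converted to a surface mesh (e.g. via a marching-cubes algorithm) and exported as an STL file for the printer; the thresholding shown here is the conceptual bridge from a scan file to printable anatomy.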

Veterans across the country benefit from these custom 3D printed health care solutions. They receive better health outcomes by being more informed and in control of their health care.

“We have a perfect milieu of impassioned clinicians, health care providers, and research and development staff driving us forward to make things happen for the Veteran,” said Dr. Ripley. “This 3D printing technology is all about empowering our frontline staff and patients to advocate for what they need and then to build it.”

Building One Layer at a Time

Building solutions that meet unique Veteran challenges is a hallmark of VA’s Office of Information and Technology’s (OIT) support for VHA’s 3D printing transformational technology. Nick Bogden, Enterprise Design Pattern Lead in OIT’s Enterprise Program Management Office, leads a five-person team that delivers architecture and design innovation to achieve VHA-wide 3D printing capabilities.

When the 3D Printing Advisory Committee needed to increase use of, and access to, the limited number of stand-alone 3D printers at VA hospitals two years ago, they reached out to Bogden and his team. The result? 3D Printing Enterprise Design Patterns (EDP). The EDP operational product provides a framework of capabilities and constraining principles to aid in the development, acquisition, and implementation of IT systems and services for 3D printing.

“Our focus is to translate the requirements and business needs of the Committee from a technology perspective and furnish an enterprise solution,” Bogden explained. Network design and security, cloud-based 3D printing services, and data security are the three key EDP pillars fundamental to advancing the current 3D printing landscape at VA.

Also supporting enterprise efforts is the VHA 3D Printing Network. Launched in 2017, it is the first and largest integrated hospital-based 3D-printing network in the country. Today, the network leverages resources, expertise, and lessons learned across 33 VA medical centers (VAMCs). More importantly, it supports VHA clinicians as they care for and treat Veterans every day.

Network-wide

Collaboration and teamwork are distinguishing characteristics of OIT’s support of 3D printing advancements. Currently, Mr. Bogden’s team is developing and writing standards that describe how the 33 VA hospital sites will install multiple printers on the network. As part of the Enterprise Printer Baseline Scrum, the team addresses the business impact of creating a centralized standards document to assist local IT staff in connecting and implementing 3D printers at medical centers across the country.

Source: US Department of Veteran Affairs

Deep learning (AI) – enhancing automated inspection of medical devices?



Integrated quality inspection processes continue to make a significant contribution to medical device manufacturing, including automated inspection as part of real-time quality control procedures. Long before COVID-19, medical device manufacturers were rapidly transforming their factory floors by leveraging technologies such as artificial intelligence (AI), machine vision, robotics, and deep learning.

These investments have enabled them to continue producing critical and high-demand products during the current period, even ramping up production to help address the pandemic. Medical device manufacturers must be lean, run at high speed, and be able to switch product variants quickly and easily, all validated to ‘Good Automated Manufacturing Practice’ (GAMP). Most medical device production processes involve some degree of vision inspection, generally due either to validation requirements or to speed constraints (a human operator cannot keep up with the speed of production). It is therefore critical that these systems are robust, easy to understand and integrate seamlessly with the production control and factory information systems.

Deep learning

Historically, such vision systems have used traditional machine vision algorithms to complete everyday tasks such as device measurement, surface inspection, label reading and component verification. Now, new “deep learning” algorithms allow the vision system to “learn” from samples shown to it, so the quality control process mirrors how an operator learns the job. The two approaches therefore differ fundamentally: traditional systems perform descriptive analysis, while deep learning systems are based on predictive analytics.

Innovative machine and deep learning processes deliver more robust recognition rates, so medical device manufacturers can benefit from enhanced levels of automation. Deep learning algorithms use classifiers, allowing image classification, object detection and segmentation at higher speed. This results in greater productivity and in reliable identification, allocation and handling of a broader range of objects such as blister packs, moulds and seals. By enhancing the quality and precision of deployed machine vision systems, deep learning adds a welcome layer of reassurance for manufacturers operating in this in-demand space.

Deep learning has other uses in medical device manufacturing too. AI relies on a variety of methods to observe patterns in data; deep learning is the subfield of machine learning that mimics the neural networks of the human brain by creating an artificial neural network (ANN). Like the human brain solving a problem, the software takes inputs, processes them, and generates an output. Not only can it help identify defects, it can also, for example, flag missing components in a medical set. Additionally, deep learning can often classify the type of defect, enabling closed-loop process control.
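The learn-from-samples idea described above can be illustrated with a deliberately tiny sketch (not a production deep-learning pipeline): a single-neuron perceptron trained on invented "good" versus "defective" pixel patterns, where the defect is a bright blob. All data, sizes and thresholds here are made up for the example; the point is that the decision rule is learned from labelled samples rather than hand-coded.

```python
import random

random.seed(0)

def make_sample(defective, n_pixels=16):
    # "Good" parts: uniformly dim pixels; "defective" parts: a 3-pixel bright blob.
    img = [random.uniform(0.0, 0.3) for _ in range(n_pixels)]
    if defective:
        start = random.randrange(n_pixels - 3)
        for i in range(start, start + 3):
            img[i] = random.uniform(0.7, 1.0)
    return img

def train(samples, labels, epochs=50, lr=0.1):
    # Perceptron learning rule: weights come from labelled examples,
    # not from hand-written inspection rules.
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Train on 200 labelled samples, evaluate on 100 fresh ones.
train_x = [make_sample(i % 2 == 1) for i in range(200)]
train_y = [i % 2 for i in range(200)]
w, b = train(train_x, train_y)

test_x = [make_sample(i % 2 == 1) for i in range(100)]
test_y = [i % 2 for i in range(100)]
accuracy = sum(predict(w, b, x) == y for x, y in zip(test_x, test_y)) / 100
```

A real system would use a deep convolutional network and thousands of annotated production images, but the workflow is the same: show labelled samples, let the model fit its own decision boundary, then measure accuracy on held-out parts.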

Deep learning can undoubtedly improve quality control in the medical device industry by providing consistent results across lines, shifts, and factories. It can reduce labour costs through high-speed automated inspection. It can help manufacturers avoid costly recalls and resolve product issues, ultimately protecting the health and safety of those towards the end of the chain.

AI limitations

However, deep learning is not a silver bullet for all medical device and pharmaceutical vision inspection applications. It may be challenging to adopt in some applications due to Food and Drug Administration (FDA) and GAMP rules relating to validation.

The main issue is the limited ability to validate such systems. Because a vision inspection solution built on AI algorithms needs sample data – both good and bad samples – validating the process becomes extremely difficult where quantitative data is required. Traditional machine vision provides specific outputs relating to measurements, grey levels, feature extraction, counts and so on, which are generally used to validate a process. With deep learning, the only output is “pass” or “fail”.
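The validation gap can be made concrete with a sketch of the two interfaces (all function names, pixel data and tolerances below are invented for illustration): the traditional system exposes numbers a protocol can check against a specification, while the learned system exposes only a verdict.

```python
def traditional_inspection(pixels, spec_min=2.0, spec_max=3.0):
    """Rule-based: returns quantitative outputs a validation protocol can audit."""
    width_mm = 0.05 * sum(p > 0.5 for p in pixels)   # toy width estimate
    mean_grey = sum(pixels) / len(pixels)
    return {"width_mm": width_mm,
            "mean_grey": mean_grey,
            "pass": spec_min <= width_mm <= spec_max}

def deep_learning_inspection(pixels, model):
    """Learned: the only externally visible output is pass/fail."""
    score = sum(w * p for w, p in zip(model, pixels))
    return "pass" if score < 1.0 else "fail"

pixels = [0.8] * 50 + [0.1] * 14
report = traditional_inspection(pixels)                  # auditable numbers
verdict = deep_learning_inspection(pixels, [0.02] * 64)  # opaque verdict
```

The validation team can trace `width_mm` back to a physical tolerance; there is no comparable intermediate quantity to audit in the second interface.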

This is a key limitation of deep-learning-enabled machine vision solutions: the user has to accept the decision of the AI tool blindly, with no detailed explanation for the choice. The vision inspection application should therefore be reviewed in advance to establish whether AI is applicable and appropriate for the solution.


In conclusion, deep learning for machine vision in industrial quality control is now widely available. Nevertheless, each application must be reviewed in detail – to understand if the most appropriate solution is to utilise traditional machine vision with quantifiable metrics or the use of deep-learning with its decision based on the data pool provided. As AI and deep learning systems continue to develop for vision system applications, we will see more novel ways of adapting the solutions to replace traditional image processing techniques.

Source: Med Tech Innovation News


Portrait of the software developer as an artist

Creating applications, data engineering and services across PCs, mobile devices, the cloud backbone and throughout the internet of things (IoT), the developer is the artist, the keyboard is the colour palette and the command line is the painter’s easel and canvas.



Art is categorised into a number of genres, such as classical, abstract, post-modern and renaissance. Similarly, software programming falls into various fields, categories and sub-genres, such as waterfall, agile, scrum or pair-based, rapid application development (RAD) and now low-code and no-code (for business people) in the wider spectrum of code creation.

Going deeper, some people would classify different programming types by the various language types, such as declarative, concurrent, data flow, functional, and so on. Others separate them out by target device as desktop, mobile, embedded firmware software development.

To add to the confusion of how to categorise software development, modern development practices are often associated with cloud-native, cloud-first and multicloud approaches.

There are those who will argue that modern software development embraces data-driven, big data insight and makes inherent use of artificial intelligence (AI) and machine learning. Computer Weekly asked a number of industry experts to define what it means to be a modern software developer.

There is a school of thought that categorises modern software development as primarily services-centric and web-scalable, which enables the developer to create code that is capable of being deployed across connected services backbones. Other software developers strive to build reusable components and frameworks that form the fundamental building blocks of new applications.

Today, there is a specialist segment of the enterprise software market that defines itself as application modernisation specialists. Often focused on the migration from legacy, mainframe and pre-cloud applications, the thrust from software tools providers in this space is towards microservices, virtual machines, containers and Kubernetes.

A modern software architecture

Volterra provides a distributed cloud platform to deploy, connect, secure and operate applications and data across multicloud and edge sites. Its CEO and founder, Ankur Singla, thinks microservices will have an increasingly important role to play in the immediate future shape of software application development. Singla says the surge seen with Kubernetes adoption and a selection of other factors are the reasons microservices will become more mainstream in 2020 and onwards.

Read more about modern software development

GitLab looks at what does not rank as a measure of modern development for contemporary forward-thinking programmers and code engineers

How should we think about the cloud-native, open-compliant, mobile-first, Agile-enriched, AI-fuelled, bot-filled world of coding and how do these forces now come together to create the new world of modern programming?

“Microservices is a part of Kubernetes’ DNA – it is the primary method by which apps are developed and deployed when using Kubernetes,” he says. “With the rise of Kubernetes, tech players are releasing open source toolkits and frameworks that address microservice challenges and ultimately allow other organisations to adopt them properly.”

As an example, Singla says Microsoft recently launched the open source Dapr project. Microsoft describes Dapr as a portable, event-driven runtime that makes it easy for developers to build resilient, stateless and stateful microservice applications that run on the cloud and edge and embrace the diversity of languages and developer frameworks. Singla says a number of startups are also ramping up efforts to address these issues.

Write once, deploy on any device

According to open source web server company Nginx, a modern application is one that supports multiple clients – whether the client is a user interface based on the React JavaScript library, a mobile app running on Android or iOS, or a downstream application that connects to a back-end application through an application programming interface (API).

“Modern applications are expected to have an undefined number of clients consuming the data and services they provide,” says Chris Stetson, chief architect and senior director of microservices engineering at Nginx. “A modern application provides an API for accessing that data and those services. The API is consistent, rather than bespoke to different clients accessing the application. The API is available over HTTP(S) and provides access to all the features and functionality available through the graphical user interface (GUI), or a command-line interface (CLI).”
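A minimal sketch of that "one consistent API for every client" idea (the endpoint path, fields and data below are invented, not Nginx's API): whether the caller is a GUI, a mobile app or a CLI, all of them consume the same JSON contract.

```python
import json

# Invented in-memory data standing in for a back-end service.
INVENTORY = [{"id": 1, "name": "widget", "stock": 42}]

def handle_request(method, path):
    """One JSON API consumed identically by GUI, mobile and CLI clients."""
    if method == "GET" and path == "/api/items":
        return 200, json.dumps(INVENTORY)
    if method == "GET" and path.startswith("/api/items/"):
        item_id = int(path.rsplit("/", 1)[1])
        for item in INVENTORY:
            if item["id"] == item_id:
                return 200, json.dumps(item)
        return 404, json.dumps({"error": "not found"})
    return 405, json.dumps({"error": "unsupported"})

# Every client, regardless of platform, issues the same call:
status, payload = handle_request("GET", "/api/items/1")
```

In a real deployment the handler would sit behind an HTTP(S) server, but the design point is the same: the contract is uniform rather than bespoke per client.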

Looking at the ability of web-scale businesses to develop new software-powered products and services suggests that modern development is typified by a high degree of experimentation and iteration. So-called old world IT does not map to modern businesses that want to emulate the success of the web giants. This means there is now a need for increased customer and user feedback in the software development process, which suggests that applications are being created with user experience (UX) sensitivity plugged into their core DNA.

The goals of modern software development practices are increasingly focused on time to value. Chris Bailey, chief architect for cloud-native solutions, IBM Cloud Paks for Applications at IBM, argues that these practices should not only focus on the ability to deliver software rapidly but, crucially, they need to ensure that software delivers real user and business value.

Multidisciplinary teams

Bailey believes software development teams need to become multidisciplinary and more self-contained, reducing handovers and scheduling dependencies on other teams. He says they should also adopt behaviour-driven development (BDD) and test-driven development (TDD) so that software is based on meeting user needs and quality requirements.

Bailey says software development teams are utilising continuous integration to increase velocity and ensure continuous quality checking occurs as part of the development process. He says they also tend to use continuous delivery with capabilities such as canary releases in order to limit risk and continually validate and enforce resilience.

Bailey believes modern software development practices involve building software in a way that makes the code easy to manage. This means adding consistent health-checking, observability and operational controls, which, he says, makes it easy to manage and operate applications once they are in production.

Managing a service mesh – the experience of Bloomberg’s DevX team

Peter Wainwright is a senior engineer and author with the developer experience (DevX) team at Bloomberg, which develops the tools and processes used by the company’s 6,000-plus software engineers to manage its codebase.

The core back-end has traditionally been composed of a few monolithic binaries, some of which are still in use. This means the development team has taken a hybrid approach to integrating service meshes into its infrastructure, says Wainwright.

“The mega-monoliths are mostly C++ and code from thousands of developers is pulled into them,” he says. “They have a weekly cycle, with release branches that spawn at set times, each the start of a new release cycle – you might call this a ‘monoschedule.’ It is, quite deliberately, not very flexible. Deployment is slow but simple. Everyone knows the social contract to get changes submitted, testing completed and any needed fixes in. It works well, because it is predictable.”

To counter the inflexibility of the “monoschedule”, the Bloomberg team built in feature toggles that allow changes to be flipped without interrupting users’ sessions – per customer, if required.
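Bloomberg's tooling is proprietary, but the general feature-toggle pattern described above can be sketched like this (class and feature names are invented): a global default per feature, with per-customer overrides that can be flipped at runtime without a redeploy.

```python
class FeatureToggles:
    """Per-customer feature flags, flippable at runtime without interrupting
    live sessions or waiting for the next release cycle."""

    def __init__(self):
        self._defaults = {}    # feature -> global on/off
        self._overrides = {}   # (feature, customer) -> on/off

    def set_default(self, feature, enabled):
        self._defaults[feature] = enabled

    def set_for_customer(self, feature, customer, enabled):
        self._overrides[(feature, customer)] = enabled

    def is_enabled(self, feature, customer=None):
        # A customer-specific override wins over the global default.
        if (feature, customer) in self._overrides:
            return self._overrides[(feature, customer)]
        return self._defaults.get(feature, False)

toggles = FeatureToggles()
toggles.set_default("new_analytics", False)               # dark for everyone
toggles.set_for_customer("new_analytics", "acme", True)   # pilot customer only
```

Flipping a flag is a data change, not a code change, which is what lets changes land inside an inflexible release "monoschedule" yet activate on demand.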

Describing how the service mesh idea at Bloomberg evolved, Wainwright says: “We first started building service meshes almost as an afterthought. Our services scale very well, up to the hardware’s limits. But they can’t scale further unless we deploy new hardware, and you can’t do that at a moment’s notice. Our customers need us most when market activity peaks.

“Monoliths mean that high-traffic functions are deployed with the same mechanisms and priority as low-traffic ones. As we added more features, the different priorities among teams meant an increasing demand for special cases and exceptions, all of which had to be tracked.

“We found that some analytics scaled better if we extracted them into services and just had the monolith route traffic to them. This solved the impedance mismatch that the monolith created between teams and it brought the added advantages that we could deploy changes more rapidly. Because analyses often query one another, a mesh naturally started to develop.”

Wainwright added: “You might think the biggest challenge here is the migration from monolithic applications to isolated containerised microservices, but it really isn’t. Migrating a routine to a service was often easy. However, as more moving pieces were introduced, we had to invest in our tooling and our culture, too.”

From a technical perspective, Wainwright says that replacing in-process calls with service calls introduces potential latency, timeouts and queueing issues. But there are other, more subtle challenges too, he says. “Service meshes are harder for engineers to reason about. These were problems they’d never had to consider before, let alone solve.”

Software development practices go hand-in-hand with change management and having diverse thinking from a broad base of people. There was a time when software was developed to serve generic business functions, all within a single, monolithic application that ran on a central system and could be accessed via a dumb terminal. These days, the preferred architecture for new software development projects is often highly componentised, where individual building blocks may run on different servers, containers or even split across different clouds.

The front-end application not only needs to run on any device, but developers are encouraged to create user interfaces that engage end-users. From an end-user perspective, the front- and back-end applications need to create good user experience.

Any company can assemble the best development tools and services, but that will not necessarily guarantee success, says Catherine Wong, chief product officer and executive vice-president of engineering at Domo, which specialises in cloud-based business intelligence tools, data visualisation and data integration.

“If it were that straightforward, we’d see a lot more startups succeed,” she says. Wong believes there are, of course, a multitude of reasons why building and scaling software is so challenging. While writing millions of lines of code and distributed bits and bytes are absolutely important, success requires a team effort.

“For the majority of us, software development is a team sport,” she says. “Our teams have long had crisply defined roles, like engineers and all their specialisations, architects, product managers, quality assurers, designers, project managers and technical writers. Those are still relevant functions, but how we enable diversity of thought and experience, as well as how we cross-train for increased empathy and better communication among the team, has become more critical than any one job title.”

According to Wong, this focus on diversity dramatically influences the business impact of the software that is being developed, and, more importantly, on a human level, it stretches project and product managers to cultivate an environment of inclusion, innovation and growth. “Over the years, I’ve seen countless examples of how the art of software development and the human elements of diversity and collaboration are what really differentiate a product and its speed of response to the market,” she says.

Source: Computer weekly

Two chipmakers are the best bet on future tech growth, long-time tech investor Paul Meeks says (video)



One of the world’s largest companies – Apple – has carried the tech sector higher this year.

The iPhone maker has rallied nearly 45% in 2020 and set another record high as recently as Friday, a day after a blowout earnings report.

But, Apple isn’t the only way to play strength in the sector, according to long-time tech investor Paul Meeks who manages the Wireless Fund.

“Probably my favorite ideas, not necessarily in the near term because I can’t predict the near term but over the next couple of years, are two semiconductor-related names,” Meeks told CNBC’s “Trading Nation” on Friday.

His first pick is Micron Technology, a chipmaker that has lagged a broad rally among the semiconductor stocks this year.

“I think they will double their stock price over the next two or three years, whereas some of the marquee tech stocks I think will continue to outperform but they can’t double from here,” said Meeks.

Micron is down 7% in 2020, while the SMH semiconductor ETF has gained 18%.

“Then the other one that I like is semiconductor capital equipment manufacturer ASM Lithography. Those two are probably my favorite ideas for tech over the next couple of years,” he said.

While those are his top ideas, he says any tech investor should at least be exposed to some of the market’s biggest companies including Apple, Facebook and Amazon.

“We’re in a tech world and everything else is just revolving around us in the periphery,” said Meeks. “Perhaps you don’t like those valuations, there’s some that I don’t like at any particular time. But, these all are stocks that are probably over time, if you’re a serious tech investor, must haves.”

Source: CNBC

KBC contactless NFC wearable payment devices


Customers of Belgium’s KBC bank can now add their debit card to their choice of a range of wearables and then use their new ring, watch, keyring or bracelet to make payments at any merchant equipped to accept contactless payments.

Brands supported include Berg, Gemini, K Ring, Laks Pay, Mondaine, Rosan Pay and Tapster.

Customers wishing to take advantage of the new service can visit the bank’s website to view all the options available. Once they have made their choice, they then place an order directly with the supplier.

“The wearable will be delivered to their preferred address,” KBC says. “It is not activated on delivery: only the customer can do that, by following a simple procedure within the secure KBC Mobile environment.

“This ensures that the entire purchase and activation process can be completed safely. Once the wearable has been activated, the customer can use it to make payments.”

The launch follows a year-long pilot project that saw the bank providing 1,000 customers with an NFC-enabled ring, key ring, wristband or watch.

“Their response was so positive that KBC decided to open up payments by wearable to all KBC customers as of the second half of 2020,” the bank says.

‘We were very pleasantly surprised by the number of enthusiastic volunteers who came forward at the beginning of the pilot to test payment with a watch, ring, key ring or bracelet,” says Karin Van Hoecke, KBC Belgium’s general manager for digital transformation.

“We regularly questioned the 1,000 customers who took part in the pilot about their experiences, so that we could make minor adjustments.

“What we learned from this was that the wearable you use to make payments needs to match your personal style. That led us to broaden the offering, so that there are wearables for everyone’s style and budget.

“The volunteers were very satisfied from the outset with the ease of use. A final survey showed they were enthusiastic about continuing to use this way of payment. That’s what prompted KBC to open up the service to all our customers. They too can now enjoy an innovative addition to the various payment options we already offer.”

Key findings from the survey of pilot participants included:

  • Rings, smartwatches and key rings came out as the most convenient wearables with 34% of users preferring the ring, 21% the smartwatch and 18% the key ring.
  • Six out of 10 users had the wearable with them almost always. Half of the users considered convenience to be the most important advantage.
  • Two out of three users said they would certainly or probably consider buying a wearable in the future.
  • Over half would firmly recommend a wearable to relatives and friends.
  • The final survey of users found that the debit card was their most commonly used means of payment, with the wearable in second place and cash bringing up the rear.

“Over 90% of payment terminals in Belgium have been fitted with the necessary technology for contactless payment,” KBC adds.

“During the recent coronavirus lockdown, Belgians have been reaching increasingly for their cards and smartphones.

“After all, contactless payment is an extremely secure and hygienic way to pay for purchases. And making contactless payments using a wearable is even more convenient.”

Source: NFCW

Medical Device Extractables and Leachables Testing in 2020



The world of physics has a foundation built on beautiful universal constants, things like π and ε0, which work their way steadfastly into virtually every aspect of modern life. Ironically, universal constants form the foundation of life for which it is popularly said that the only constant is change. Over the past 5 years, the medical device community has swung from nearly full ignorance of the potential power of chemistry testing to full adoration and acceptance, and now back—in a sense—to a state of scrutiny and skepticism. Regulators, in response to an influx of medical device submissions centered on supporting chemistry data, have increased their knowledge and finesse with the science and have been asking tough questions. In response to this feedback, as medical device chemistry for toxicology (ChemTox) has matured, the overall strategy has changed dramatically on some points.

In this environment, with a new ISO 10993-18, EU Medical Device Regulation (MDR), and the new ISO 21726, it is common for us to hear, “So, what are you guys doing for medical device chemistry testing?” This article provides a high-level answer to that question, which is, in short, “everything we can.” The primary goal of ChemTox is to provide data useful for an unambiguous toxicological risk assessment. To meet that requirement, the study must be sufficiently broad in scope and sensitive enough to avoid missing potentially toxic compounds, as well as to provide positive identifications. In addition to these foundational scientific requirements, the study must also meet regulatory expectations of completeness, even on points where labs might professionally disagree about scientific validity.

Sufficient Breadth in ChemTox

Sufficient breadth in study design is essential to ensure that important classes of compounds that might migrate from a medical device aren’t missed in an extractables study. Therefore, extraction solvents covering a range of polarities are required. The static dielectric constant is a good measure of solvent polarity and is often used in pharmaceutical compounding to formulate liquids suitable for dissolving drugs. We know that the bulk polarity of human tissue ranges from that of water (dielectric constant, δ = 80) to slightly less polar than water (in fatty tissue like the brain, δ = 43-58). From a clinical perspective, we would expect extraction solvents in this range to be sufficient. In practice, for all devices with prolonged or permanent contact, three solvents are requested by FDA: water (δ = 80), a mid-polar solvent like isopropanol (IPA) (δ = 18.3), and a non-polar solvent like hexane (δ = 2.0).
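As a quick sanity check of the reasoning above (using only the dielectric constants quoted in the text; the function name is our own), one can verify that the three FDA-requested solvents bracket the quoted polarity range of human tissue:

```python
# Dielectric constants (delta) as quoted in the article.
SOLVENTS = {"water": 80.0, "isopropanol": 18.3, "hexane": 2.0}
TISSUE_RANGE = (43.0, 80.0)   # fatty brain tissue up to water, per the text

def brackets_tissue(solvents, lo, hi):
    """The solvent set should span at or beyond the tissue polarity range."""
    values = sorted(solvents.values())
    return values[0] <= lo and values[-1] >= hi

covered = brackets_tissue(SOLVENTS, *TISSUE_RANGE)
```

The non-polar and mid-polar solvents extend well below the tissue range, which is the point: aggressive extraction should over-cover, not just match, clinical conditions.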

In addition to ensuring a broad range of compounds are soluble in the extraction matrix, breadth of analytical methods and instrumentation is also required. At a minimum, a study should include a method for volatile organic compounds (VOCs), semi-volatiles (SVOCs), non-volatiles (NVOCs), and inorganic/elemental compounds. A typical extraction and analysis matrix providing sufficient breadth (and meeting current regulatory expectations) is shown below in Table 1.

[Table 1: Typical Analytical Test Matrix – the analytical methods listed include Headspace GC/MS and Direct Injection GC/MS; the remaining rows and the extraction-solvent columns of the original table were not recovered]

The Question of Extraction Duration – Is it Exhaustive?

The term “extractables and leachables” for medical devices is an adaptation of the same term long applied to pharmaceutical container/closure systems. For patient-contacting medical devices, a true leachables study is impossible because the device leaches into the body, not into a drug product. Therefore, we seek to conduct a single-phase study that both uses aggressive extraction conditions to identify all hazards and provides an acceptable level of quantification. Using a definition commonly applied elsewhere in ISO 10993, regulators have required that extractables studies be conducted exhaustively for devices with prolonged or long-term contact. This involves serially extracting a device until the amount of material extracted in a round is less than 10% of that found in the initial extraction. If non-volatile residue (NVR) is used to provide evidence of exhaustiveness, as the previous version of ISO 10993-18 required, the submitter must be prepared for questions about why the amounts detected by mass spectrometry do not align. A better, more complete picture is given by serially extracting at 24-hour intervals and analyzing each extract with the full suite of analytical methods.
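The exhaustiveness criterion described above reduces to a simple loop, sketched below with invented recovery amounts: keep running serial 24-hour extraction rounds until a round recovers less than 10% of what the first round recovered.

```python
def is_exhaustive(amounts, threshold=0.10):
    """True once the latest serial extraction recovers less than `threshold`
    (default 10%) of the amount recovered in the initial extraction."""
    if len(amounts) < 2:
        return False
    return amounts[-1] < threshold * amounts[0]

def serial_extraction(round_amounts, threshold=0.10):
    """Run 24-hour extraction rounds until the exhaustiveness criterion is met."""
    amounts = []
    for amount in round_amounts:       # each value = material recovered per round
        amounts.append(amount)
        if is_exhaustive(amounts, threshold):
            break
    return amounts

# Invented example data: recovery tails off across successive rounds.
rounds = serial_extraction([120.0, 40.0, 18.0, 9.0, 4.0])
```

Here the fourth round (9.0) falls below 10% of the first (12.0), so extraction stops after four rounds; in the lab, each of those extracts would then be analyzed with the full suite of methods.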

Sufficient Sensitivity in ChemTox

After identifying the appropriate threshold of toxicological concern (TTC) for the toxicological assessment, applying uncertainty factors when setting the analytical evaluation threshold (AET) can be tricky. Historically, labs have relied on two key papers that measured variance in response factors; more recently, regulators have been requesting lab-specific data to support the uncertainty factor used in screening, and asking that the AET be clearly justified.
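A commonly cited simplified form of the AET calculation can be sketched as below. This is a sketch only: the numbers are invented, the variable names are ours, and a real submission must justify every term (including the uncertainty factor) against ISO 10993-18 and the device's clinical exposure.

```python
def aet_ug_per_ml(dbt_ug_per_day, devices_extracted, extract_volume_ml,
                  devices_per_day, uncertainty_factor):
    """Simplified analytical evaluation threshold (AET), in micrograms/mL.

    dbt_ug_per_day:     dose-based threshold, e.g. a TTC value in ug/day
    uncertainty_factor: accounts for response-factor variation in screening
    """
    return (dbt_ug_per_day * devices_extracted) / (
        extract_volume_ml * devices_per_day * uncertainty_factor)

# Invented example: TTC of 1.5 ug/day, 1 device extracted in 50 mL,
# 1 device used per patient per day, uncertainty factor of 2.
threshold = aet_ug_per_ml(1.5, 1, 50.0, 1, 2)
```

The regulators' recent push is precisely about that last divisor: the uncertainty factor must be backed by lab-specific response-factor data rather than assumed.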

Providing an Identification of Sufficient Quality

After ensuring the necessary breadth of the analytical approach and applying a scientifically sound analytical evaluation threshold, the next step in ChemTox is to identify the detected compounds with a sufficient level of confidence. These identifications, together with the detected concentrations of the compounds, define the starting point of the subsequent toxicological assessment. Correct identification of a compound is absolutely critical, as a misidentification might lead to incorrect conclusions about the safety and biocompatibility of the tested device.

Historically, it was common practice to rely solely on mass spectral matching. In this approach, the mass spectrum under investigation is compared with spectra stored in mass spectral libraries; the higher the similarity between the two spectra, the higher the resulting match factor. For those unfamiliar with MS, this may sound like a solid and reliable approach. With experience, however, it becomes obvious that spectral matching is only the first step toward a confident identification, and is unreliable when used on its own.
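One common way such a match factor is computed (a sketch; production library searches such as NIST's use weighted variants) is the cosine similarity between the two spectra treated as intensity vectors over their m/z values. The spectra below are invented for illustration.

```python
import math

def match_factor(spectrum_a, spectrum_b):
    """Cosine similarity between two mass spectra given as {m/z: intensity}
    dicts; 1.0 means identical relative intensity patterns, 0.0 no overlap."""
    mzs = set(spectrum_a) | set(spectrum_b)
    dot = sum(spectrum_a.get(mz, 0.0) * spectrum_b.get(mz, 0.0) for mz in mzs)
    norm_a = math.sqrt(sum(v * v for v in spectrum_a.values()))
    norm_b = math.sqrt(sum(v * v for v in spectrum_b.values()))
    return dot / (norm_a * norm_b)

# Invented spectra: the unknown closely resembles library entry A, not B.
unknown   = {43: 999, 58: 410, 71: 120}
library_a = {43: 985, 58: 430, 71: 100}
library_b = {91: 999, 65: 300, 39: 250}

score_a = match_factor(unknown, library_a)
score_b = match_factor(unknown, library_b)
```

A high score only says the patterns look alike; isomers and co-eluting compounds can score highly too, which is exactly why a match factor alone is not a confident identification.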

One of the problems with spectral matching is the limited availability of libraries containing the compounds expected to be present in and on medical devices. While commercially available libraries such as NIST and Wiley exist for VOCs and SVOCs, no such library is available for NVOCs. Authorities now expect significant effort to be spent on unknown compounds. These efforts may rely on a proprietary database and/or on experienced mass spectrometrists who manually interpret the obtained spectra. Nelson Labs has invested for decades in its own library, gradually built by purchasing, measuring and recording analytical standards. To date this library contains 1,000 VOCs, 3,500 SVOCs and 2,000 NVOCs, minimising the number of compounds that must be reported as unidentified.

To help a toxicologist understand the reliability of reported compounds, ISO 10993-18:2020 asks laboratories to indicate the identification status in chemical characterization reports. The proposed levels, from high to low confidence, are confirmed, confident, tentative, speculative, and unidentified. The higher the confidence level, the lower the uncertainty that must be carried into the toxicological assessment. To gain confidence, additional evidence for the identification is needed: examples are manual interpretation of the mass spectrum by a mass spectrometrist, detection of the compound by another analytical technique, or known use of the compound in the manufacture of the medical device. Confirmed identifications are especially important when the reported concentration is close to the permitted daily dose of the compound.


Medical device ChemTox has been a moving target as it has matured. While shifting expectations can be frustrating, studies conducted today provide much more thorough and protective data than just 2 to 3 years ago. It can be expected that things will continue to shift until regulatory bodies reach a consensus on their expectations of these studies and provide guidance on the same. Nelson Labs remains committed to frequent and transparent communication and guidance to both FDA and sponsors we help through this process.

Source: MDDIOnline

How Adopting An Explorer's Mindset Can Help You To Lead Innovation

As the pace of change increases, the great companies will be the ones who can explore new opportunities, while running their current business.


This ability to live in the two worlds of explore and exploit will be the true test of corporate leadership. Even within business education, how to run an ambidextrous organization will become Management 101.  

What is key for leaders is to understand that they cannot manage their innovation portfolio in the same way that they manage their current business. We expect innovation teams to have a different mindset that is focused on searching, dealing with ambiguity, testing ideas, failing and iterating. We have to set similar expectations for leaders. They also need to lead with a different mindset. 

Leading Exploit

In the existing business, leaders are used to managing teams that plan and execute. As such, they lead their teams by having them outline their plans in detail, create a roadmap for execution and make business cases with clear projections about expected outcomes.

If the team’s plan, roadmap and business case make sense, leaders will invest in the proposed idea. They will then manage the team by tracking whether it is hitting the milestones set out in the plan. Failure to deliver on time and on budget can have severe career-limiting consequences for employees.

Such an approach to management can only work in a relatively predictable world where companies know their customers and business model. With disruption everywhere, there are even questions about whether any company’s existing business model is predictable. However, the existing knowledge and experience we have in running our business allow us to make plans with some level of confidence. 

Innovation is a totally different game. This is especially the case when we are working on transformative innovations, where we are creating new products and services for new markets. In this case, you have to take the mindset of someone who is leading a team of explorers who are about to step into the great unknown.

This is a useful metaphor to use. If you have a team that is about to travel to an unknown place, it is not really wise for you to ask them for a detailed plan, roadmap and expected returns. This makes no sense. As a leader in that situation, you have to realize that you are asking people to go out into the world and find out if there is something worth investing in there. 

This shift in mindset will change your leadership style. You are not asking people to go back to a place they have visited several times before (i.e. the existing business). You are asking people to go to a place that they have never been before (i.e. new business ideas). Sure, there may be some reports from your market research consultants that there is something worthwhile in the new place, but until you try to build a business there, you do not really know what will or will not work.

So if you are a smart investor, you will not invest the whole budget for implementing a business idea on the basis of an explorer’s roadmap and projections. If you do, you are likely to lose all your money, or you may never see the explorer again! To lead an exploration team you need to be world-class at doing the following five things:

  1. Choosing Where To Play: As a leader, you have to provide your teams with clear strategic goals for innovation. You should not just tell your innovation teams to go to work and bring you something cool. You have to be clearer and more explicit. Which parts of the business world do you want them to go and explore? Which arenas and emerging trends do you want them to address? This guidance will then form the basis of how you decide which expedition teams to fund.
  2. Setting The Right Expectations: As the teams go out to explore, you have to be clear what you want them to find out and how you are going to evaluate them when they come back. These expectations have to be reasonable given where they are on the innovation journey. In one case, you may just want a team to find out if there is a real customer need that is worth building a solution for. In another case, you may want them to go and test several solutions to find out what people will pay for. There are several learning goals you could set your teams. As leaders, we have to be clear about our expectations.
  3. Giving Teams The Right Tools:  Given our expectations, we have to make sure that our teams have the right tools and skills to gather the evidence we require to make decisions. There is no point in sending teams out there with the wrong tools or deploying teams that do not know how to design and test business ideas. What you will get in return is poor quality data that will not help you in making informed decisions. 
  4. Providing Just Enough Resources: Even as we give our teams the right tools, we need to ensure that we are giving them just enough resources to explore. In the early stages of exploration, we do not want to over-invest. Rather, we want to make incremental bets using metered funding (i.e. small bets that increase over time for the teams that are showing evidence of success). If we give our teams too many resources, we can lose the ability to hold them to account. They may even start implementing the wrong ideas. But if they have to come back and make a case to earn their next investment, they will make sure that they gather the right evidence based on the expectations and criteria you have set.
  5. Accepting That Anything Can Happen: This is an important mindset for leaders to have. You must be willing to accept that anything can happen. When teams are out exploring, you must expect that a lot of them will come back and tell you that they found nothing (i.e. the expedition stops). Some teams will come back and tell you that they did not find what they were expecting. Instead, they found something else that could be more valuable (i.e. the expedition pivots). A few teams will come back and confirm that there is value to be created where they went to explore (i.e. the expedition perseveres). As a leader, you have to be prepared for all three outcomes and make the right decisions. There should be no negative consequences for any teams, regardless of the outcome. 

Making Decisions

Understanding that exploring for new opportunities differs from exploiting a currently successful business is an important mindset shift for leaders to make. In that same vein, leaders also need to change how they make decisions. In the explorer’s world, we make decisions based on evidence. It makes no sense for leaders to pour endless resources into an expedition to a place they have never seen just because they like the team or because the team is working on a pet project. All investment decisions must be based on the evidence that teams bring back. This is the only way to succeed as an innovation leader.

Source: Forbes

Silicon Valley has admitted facial recognition technology is toxic – about time

An engineer in Beijing works on facial recognition software to identify people when they wear face masks, in March this year.

 Read More

So IBM has seen the light on facial recognition technology. On Monday, in a dramatic and surprisingly passionate statement (at least for the CEO of a major tech company), Arvind Krishna called on the US Congress to enact reforms to advance racial justice and combat systemic racism, while announcing that his company was getting out of the facial recognition business.

In his letter, Mr Krishna said that “IBM no longer offers general-purpose IBM facial recognition or analysis software” and “firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and principles of trust and transparency. We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies.”

Source: The Guardian

Our Behaviour in This Pandemic Has Seriously Confused AI Machine Learning Systems

 Read More

The chaos and uncertainty surrounding the coronavirus pandemic have claimed an unlikely victim: the machine learning systems that are programmed to make sense of our online behaviour.

The algorithms that recommend products on Amazon, for instance, are struggling to interpret our new lifestyles, MIT Technology Review reports.

And while machine learning tools are built to take in new data, they’re typically not so robust that they can adapt as dramatically as needed.

For instance, MIT Tech reports that a company that detects credit card fraud needed to step in and tweak its algorithm to account for a surge of interest in gardening equipment and power tools.

An online retailer found that its AI was ordering stock that no longer matched with what was selling. And a firm that uses AI to recommend investments based on sentiment analysis of news stories was confused by the generally negative tone throughout the media.

“The situation is so volatile,” Rael Cline, CEO of the algorithmic marketing consulting firm Nozzle, told MIT Tech.

“You’re trying to optimize for toilet paper last week, and this week everyone wants to buy puzzles or gym equipment.”

While some companies are dedicating more time and resources to manually steering their algorithms, others see this as an opportunity to improve.

“A pandemic like this is a perfect trigger to build better machine-learning models,” Sharma said.

Source: The Guardian

US and UK ‘lead push against global patent pool for Covid-19 drugs’

A researcher in Beijing works on an experimental coronavirus vaccine

 Read More

Ministers and officials from every nation will meet via video link on Monday for the annual world health assembly, which is expected to be dominated by efforts to stop rich countries monopolising drugs and future vaccines against Covid-19.

As some countries buy up drugs thought to be useful against the coronavirus, causing global shortages, and the Trump administration does deals with vaccine companies to supply America first, there is dismay among public health experts and campaigners who believe it is vital to pull together to end the pandemic.

While the US and China face off, the EU has taken the lead. The leaders of Italy, France, Germany and Norway, together with the European commission and council, called earlier this month for any innovative tools, therapeutics or vaccines to be shared equally and fairly.

“If we can develop a vaccine that is produced by the world, for the whole world, this will be a unique global public good of the 21st century,” they said in a statement.

The sole resolution before the assembly this year is an EU proposal for a voluntary patent pool. Drug and vaccine companies would then be under pressure to give up the monopolies that patents grant them on their inventions, and with them the ability to charge high prices, so that all countries can make or buy affordable versions.

In the weeks of negotiations leading up to the meeting, which is scheduled to last for less than a day, there has been a dispute over the language of the resolution. Countries with major pharmaceutical companies argue they need patents to guarantee sufficiently high prices in wealthy nations to recoup their research and development costs.

Even more fraught have been attempts to reinforce countries’ existing rights to break drug and vaccine company patent monopolies if they need to for the sake of public health. A hard-fought battle over Aids drugs 20 years ago led to the World Trade Organization’s Doha declaration on trade-related intellectual property (Trips) in favour of access to medicines for all, but the US, which has some of the world’s biggest drug companies, has strongly opposed wording that would encourage the use of Trips.

Source: The Guardian

Three firmware blind spots impacting security

Built into virtually every hardware device, firmware is lower-level software that is programmed to ensure the hardware functions properly.

 Read More


As software security has been significantly hardened over the past two decades, hackers have responded by moving down the stack to focus on firmware entry points. Firmware offers a target that basic security controls can’t access or scan as easily as software, while allowing attackers to persist and continue leveraging many of their tried-and-true attack techniques.

The industry has reacted to this shift in attackers’ focus by making advancements in firmware security solutions and best practices over the past decade. That said, many organizations are still suffering from firmware security blind spots that prevent them from adequately protecting systems and data.

This can be caused by a variety of factors, from simple platform misconfigurations or reluctance about installing new updates to a general lack of awareness about the imperative need for firmware security.

In short, many don’t know what firmware security hazards exist today. To help readers stay more informed, here are three firmware security blind spots every organization should consider addressing to improve its overall security stance:

1. Firmware security awareness

The security of the firmware running on the devices we use every day has become a focal point for researchers across the security community. With multiple components running a variety of different firmware, it can be overwhelming to know where to start. A good first step is recognizing firmware as an asset in your organization’s threat model and establishing security objectives for confidentiality, integrity, and availability (CIA). Here are some examples of how CIA applies to firmware security:

  • Confidentiality: There may be secrets in firmware that require protection. If attackers were able to access the firmware contents, the BIOS password, for instance, might grant them an authentication bypass.
  • Integrity: This means ensuring the firmware running on a system is the firmware intended to be running and hasn’t been corrupted or modified. Features such as secure boot and hardware roots of trust support the measurement and verification of the firmware you’re running.
  • Availability: In most cases, ensuring devices have access to their firmware in order to operate normally is the top priority for an organization as far as firmware is concerned. A potential breach of this security objective would come in the form of a permanent denial of service (PDoS) attack, which would require manual re-flashing of system components (a sometimes costly and cumbersome solution).

The first step toward firmware security is awareness of its importance as an asset to an organization’s threat model, along with the definition of CIA objectives.

2. Firmware updates

The increase in low-level security research has led to an equivalent increase in findings and fixes provided by vendors, contributing to the gradual improvement of platform resilience. Vendors often work with researchers through their bug bounty programs, their in-house research teams, and with researchers presenting their work in conferences around the world, in order to conduct coordinated disclosure of firmware security vulnerabilities. The industry has come a long way enabling collaboration, enabling processes and accelerating response times towards a common goal: improving the overall health and resilience of computer systems.

The firmware update process can be complex and time consuming, and involves a variety of parties: researchers, device manufacturers, OEMs, etc. For example, once UEFI’s EDK II source code has been updated with a new fix, vendors must adopt it and push the changes out to end customers. Vendors issue firmware updates for a variety of reasons, but some of the most important patches are designed explicitly to address newly discovered security vulnerabilities.

Regular firmware updates are vital to a strong security posture, but many organizations are hesitant to introduce new patches due to a range of factors. Whether it’s concern over the potential time or cost involved, or fear of bricking the platform, there are a variety of reasons why updates are left uninstalled. Delaying or forgoing available fixes, however, increases the amount of time your organization may be at risk.

A good example of this is WannaCry. Although Microsoft had previously released updates to address the exploit, the WannaCry ransomware wreaked havoc on hundreds of thousands of unpatched computers throughout the spring of 2017, affecting hundreds of countries and causing billions of dollars in damages. While this outbreak wasn’t the result of a firmware vulnerability specifically, it offers a stark illustration of what can happen when organizations choose not to apply patches for known threats.

Installing firmware updates regularly is arguably one of the simplest and most powerful steps you can take toward better security today. Without them, your organization remains exposed to known vulnerabilities for which fixes already exist.

If you’re concerned that installing firmware updates might inadvertently break your organization’s systems, consider conducting field tests on a small batch of systems before rolling them out company-wide, and always keep a backup of the current platform image to revert to as a precautionary measure. Be sure to establish a firmware update cadence that works for your organization in order to keep your systems up to date with current firmware protections at minimal risk.

3. Platform misconfigurations

Another issue that can cause firmware security risks is platform misconfigurations. Once powered on, a platform follows a complex set of steps to properly configure the computer for runtime operations. There are many time- and sequence-based elements and expectations for how firmware and hardware interact during this process, and security assumptions can be broken if the platform isn’t set up properly.

Disabled security features such as secure boot, VT-d, port protections (like Thunderbolt), execution prevention, and more are examples of potentially costly platform misconfigurations. All sorts of firmware security risks can arise if an engineer forgets a key configuration step or fails to properly configure one of the hundreds of bits involved.

Most platform misconfigurations are difficult to detect without automated security validation tools: different generations of platforms may define registers differently, there is a long list of things to check for, and there may be dependencies between settings. It can quickly become cumbersome to keep track of proper platform configurations in a cumulative way.

Fortunately, tools like the Intel-led, open-source Chipsec project can scan for configuration anomalies within your platform and evaluate security-sensitive bits within your firmware to identify misconfigurations automatically. As a truly cumulative, open-source tool, Chipsec is updated regularly with the most recent threat insights so organizations everywhere can benefit from an ever-growing body of industry research. Chipsec also has the ability to automatically detect the platform being run in order to set register definitions. On top of scanning, it also offers several firmware security tools including fuzzing, manual testing, and forensic analysis.

Although there are a few solutions capable of inspecting a system’s configuration, running a Chipsec scan is a free and quick way to ensure a particular system’s settings are set to recommended values.

Your organization runs on numerous hardware devices, each with its own collection of firmware. As attackers continue to set their sights further down the stack in 2020 and beyond, firmware security will be an important focus for every organization. Ensure your organization properly prioritizes defenses for this growing threat vector: install firmware updates regularly, commit to continuously detecting potential platform misconfigurations, and enable available security features and their respective policies in order to harden firmware resiliency across confidentiality, integrity and availability.

Record fall in hiring as COVID-19 ‘wreaks havoc’

 Read More

Hiring activity has fallen to a record low as many employers have imposed recruitment freezes in response to the COVID-19 pandemic, new research has found.

Artificial intelligence must be regulated to stop it damaging humanity, Google boss Sundar Pichai says.

Artificial intelligence must be regulated to save humanity from being hit by its dangers, Google’s boss has said.

 Read More

Google CEO Sundar Pichai speaks during a conference in Brussels on January 20, 2020

The potential damage the technology could do means it is “too important” not to be constrained, according to Sundar Pichai.

While it has the potential to save and improve lives, it could also cause damage through misleading videos and the “nefarious uses of facial recognition”, he wrote in the New York Times, calling on the world to work together to define what the future of AI should look like.

The regulation would be required to prevent AI being influenced by bias, as well as protect public safety and privacy, he said.

“Growing up in India, I was fascinated by technology. Each new invention changed my family’s life in meaningful ways. The telephone saved us long trips to the hospital for test results. The refrigerator meant we could spend less time preparing meals, and television allowed us to see the world news and cricket matches we had only imagined while listening to the short-wave radio,” he said.

“Now, it is my privilege to help to shape new technologies that we hope will be life-changing for people everywhere. One of the most promising is artificial intelligence.

“Yet history is full of examples of how technology’s virtues aren’t guaranteed. Internal combustion engines allowed people to travel beyond their own areas but also caused more accidents. The internet made it possible to connect with anyone and get information from anywhere, but also easier for misinformation to spread.”

Mr Pichai pointed to Google’s own published principles on AI and said existing rules such as the GDPR in the EU could be used as the foundation for AI regulation.

“International alignment will be critical to making global standards work. To get there, we need agreement on core values. Companies such as ours cannot just build promising new technology and let market forces decide how it will be used. It is equally incumbent on us to make sure that technology is harnessed for good and available to everyone,” he said.

He added that the tech giant wanted to work with others on crafting regulation.

“Google’s role starts with recognising the need for a principled and regulated approach to applying AI, but it doesn’t end there. We want to be a helpful and engaged partner to regulators as they grapple with the inevitable tensions and trade-offs. We offer our expertise, experience and tools as we navigate these issues together.

“AI has the potential to improve billions of lives, and the biggest risk may be failing to do so. By ensuring it is developed responsibly in a way that benefits everyone, we can inspire future generations to believe in the power of technology as much as I do.”

Google is one of the world’s most prominent AI developers – its virtual helper, the Google Assistant, is powered by the technology, and the company is also working on a number of other products, including driverless cars, which utilise AI.

Mr Pichai also revealed that Google’s own principles specify that the company will not design or deploy artificial intelligence in some situations, including those which “support mass surveillance or violate human rights”.

Source: The Independent 

Connected healthcare

Connectivity throughout healthcare is yielding huge benefits for the industry and patients alike and is a trend set to continue and accelerate.

 Read More

However, cybersecurity is an issue and while the need for a secure enterprise-level architecture is widely acknowledged, the role played by securely coded devices is easier to ignore, yet vitally important.

While patients and providers benefit from improved operational efficiency derived from the use of real-time data from a wide range of sources there is, however, a downside.

As the number of medical devices networked increases, so does the number of different points (“attack vectors”) accessible to any bad actor looking to manipulate data and cause mischief.

In 2011, the ethical hacker Barnaby Jack shone a spotlight on the issue by using modified antennae and software to demonstrate how it was possible to wirelessly attack, and take control of, Medtronic’s implantable insulin pumps and to command them to administer a fatal dose of insulin.

More recent examples demonstrate that such a direct attack remains a challenge. On August 23, 2017, for example, the Food and Drug Administration (FDA) in the US approved a firmware update to reduce the risk of patient harm due to potential exploitation of cybersecurity vulnerabilities for certain Abbott (formerly St. Jude Medical) pacemakers.

The WannaCry malware attack on the UK’s National Health Service (NHS) is another example from 2017. The malware exploited the Windows implementation of the Server Message Block (SMB) protocol to propagate, targeting MRI and CT scanners, which ran on XP workstations. These medical devices were encrypted and held for ransom, which prevented safe and effective patient treatment.

The attack was estimated to have affected more than 200,000 computers across 150 countries, with estimates of total damages ranging from hundreds of millions to billions of dollars.

Above: Mapping the capabilities of the LDRA tool suite to the guidelines of IEC 62304:2006 +AMD1:2015

Defence in depth

The diversity in the nature of attacks illustrates why no single defensive measure can ever solve the problem.

There needs to be basic housekeeping such as updating older operating systems, securing protocols, and updating and validating software and firmware. But even with those precautions, there are countless ways to attack a system and an attacker only needs a single vulnerability.

According to Professor James Reason, many aspects of medical endeavour require human input and the inevitable human error that goes with it. But generally, there are so many levels of defence that for a catastrophe to happen, an entire sequence of failures is required.

Reason likened this to a sequence of slices of ‘Swiss Cheese’, except that in his model the holes in the ‘slices’ are forever moving, closing, widening and shrinking.

Just like the checks and controls applicable to human input into medical systems, a multiple-level approach to cybersecurity makes a great deal of sense, such that if aggressors get past the first line of defence, then there are others in waiting.

Approaches and technologies that can contribute to these defences include secure network architectures, data encryption, secure middleware, and domain separation.

The medical devices themselves deserve particular attention, however. For an aggressor, the infrastructure surrounding them is a means to an end; only the devices themselves provide the means to threaten patients.

Medical devices and cybersecurity

In the past, embedded medical software was usually written for static, fixed-function, device-specific applications, and isolation was a sufficient guarantee of security. The approach to cybersecurity and secure software development tended to be reactive: develop the software, then use penetration, fuzz, and functional testing to expose any weaknesses.

The practice of “patching” to address weaknesses found in the field is essentially an extension to this principle, but device manufacturers have a poor track record of delivering patches in a timely fashion.

In an implicit acknowledgement of that situation, the MITRE Corporation and the FDA released their “Medical Device Cybersecurity” playbook in October 2018, consisting of four phases: preparation; detection and analysis; containment, eradication and recovery; and post-incident recovery.

Adopting a proactive approach to cybersecurity in medical devices

“Preparation” is perhaps the key element to take from this incident response cycle – not only in identifying the security measures that are in place for existing devices, but in proactively designing them into new products.

One approach to designing in cybersecurity is to mirror the development processes advocated by functional-safety standards such as IEC 62304 ‘Medical device software – software life cycle processes’.

IEC 62304 provides a common framework for developing software that fulfils requirements addressing quality, risk and software safety throughout all aspects of the software development lifecycle.

Using a structured development lifecycle in this way not only applies best practices to the development lifecycle, but it also creates a traceable collection of artefacts that are invaluable in helping to provide a quick response should a breach occur.

Beyond the safety implications of any breach, this approach addresses the FDA’s recommendation that a medical device must not allow sensitive data to be viewed or accessed by an unauthorised entity. The data must remain protected and accurate, preventing hackers from altering a diagnosis or important patient information.

Above: The Swiss Cheese model, illustrating how a sequence of imperfect defensive layers will only fail when those imperfections coincide

Secure code development

Compliance with the processes advocated by regulations can be demonstrated most efficiently by applying automated tools.

Although there are some differences between developing functionally safe and cyber-secure applications, there are many similarities too. For example, both perspectives benefit from the definition of appropriate requirements at the outset, and from the bidirectional traceability of those requirements to make sure that they are completely implemented.

Unit testing and dynamic analysis are equally applicable to both functional safety and cybersecurity, and for the latter they are vital to ensure (for example) that defence mechanisms are effective and that there is no vulnerability to attack when boundary values are applied.

IEC 62304 also requires the use of coding standards to restrict the use of the specified programming language to a safe subset. In practice, code written to be functionally safe is generally also secure, because the same malpractices in programming language use often give rise to both safety and security concerns.

The growing complexity of the health delivery network

No connected medical system is ever going to be both useful and absolutely impenetrable. It makes sense to protect it proportionately to the level of risk involved if it were to be compromised, and that means applying multiple levels of security.

Medical devices themselves deserve particular attention because they provide the primary means by which patient safety can be threatened. The structured development approach of a functional safety standard such as IEC 62304 can provide the ideal framework for applying a proactive approach to the development of secure applications.

Happily, many of the most appropriate quality assurance techniques for secure coding are well proven in the field of functional safety. These techniques include static analysis to ensure the appropriate application of coding standards, dynamic code coverage analysis to check for any excess “rogue code”, and the tracing of requirements throughout the development process.
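
The requirements-tracing idea can be pictured with a minimal sketch (the requirement IDs and test records below are invented for illustration): each requirement is linked to the tests that verify it, so any requirement with no passing test, and any test that traces to no known requirement, is flagged immediately:

```python
# A minimal bidirectional traceability check: every requirement must be
# covered by at least one passing test, and every test must trace back
# to a known requirement. IDs and results are invented examples.

requirements = {"REQ-001", "REQ-002", "REQ-003"}

test_results = [
    {"test": "TC-01", "covers": "REQ-001", "passed": True},
    {"test": "TC-02", "covers": "REQ-002", "passed": True},
    {"test": "TC-03", "covers": "REQ-999", "passed": True},  # orphan test
]

covered = {t["covers"] for t in test_results if t["passed"]}
uncovered = sorted(requirements - covered)  # requirements with no passing test
orphans = sorted(covered - requirements)    # coverage tracing to no requirement

print("uncovered requirements:", uncovered)
print("orphan test coverage:", orphans)
```

In a real project the same check would run against the artefacts produced by the lifecycle tooling; the point is that traceability gaps become a mechanical query rather than a manual audit.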

The legacy of such a development process includes a structured set of artefacts that provide an ideal reference should a breach of security occur in the field. Given the dynamic nature of the endless battle between hackers and solution providers, optimising breach response times is not merely a good idea. It is a potential lifesaver.

Source: New Electronics

A new vision for AI in health-tech

Recently established digital transformation unit, NHSX, has published a major new report – Artificial Intelligence: How to get it right. This dovetails with a £250 million investment in a new NHS AI lab to be run in collaboration between the Accelerated Access Collaborative and NHSX.

The report explores the roll-out of AI technology across the spectrum of healthcare, in:

  • diagnostics
  • knowledge generation (drug discovery, pattern recognition, etc)
  • public health (screening programmes and epidemiology)
  • system efficiency
  • precision medicine (also called P4 – predictive, preventive, personalised and participatory)

Organisations like Genomics England, with its resource of over 100,000 genomes and over 2.5 billion clinical data points, offer an unparalleled opportunity to make advances in cancer diagnosis and treatment and in rare disease analysis. Currently, diagnosis and screening are the major areas of application for AI, with over 130 products targeting over 70 different conditions under development. Achieving the benefits of AI throughout all potential areas of application is not guaranteed. The report identifies the major challenges and the work being done to address them.

An evidence-based approach

Importantly, the report focuses on real-world evaluation and evidence collection, as the best way to promote acceptance among both clinicians and patients.

Demonstrating the effectiveness of AI-based tools will be an important part of achieving widespread trust and adoption. A collaborative effort between NHSX, Public Health England, MedCity and health technology assessor NICE has produced an evidence standards framework for digital health technologies. This work will be taken forward by NICE with a pilot evaluation programme.

Regulatory uncertainty

The report highlights a confusing jungle of different regulators that have some involvement in the development of a health-tech innovation from concept to clinic. It is often not clear which body has oversight of which area of regulation – and indeed, some areas are not overseen at all (the quality of data used to train algorithms, for example). It is unsurprising that developers find the pathway difficult to navigate. The new NHS AI Lab will take on the task of helping to forge a clear pathway for innovators.

Many developers lack awareness of the NHS Code of Conduct for Data-Driven Health and Care Technology and the regulatory approval pathways they may need to follow. For example, half of the developers surveyed in a State of the Nation analysis had no intention to obtain certification of their technology as a medical device when this is often likely to be required.

International coordination

Internationally, development is outpacing the ethical and regulatory framework. A survey of members of the Global Digital Health Partnership highlights the way policy and regulation is trailing behind the development of AI applications in health and care. Regulators are working to catch up, but there is much more to do. Development of law and regulation in a piecemeal way, country by country, is likely to hold back development. The report highlights the efforts to achieve international coordination, such as Artificial Intelligence for Health (FG-AI4H) – a focus group set up by the World Health Organisation and the International Telecommunications Union, to work towards a standardised assessment framework for the evaluation of AI-based digital health technologies.

Where next?

Embracing the potential of AI to build and strengthen the NHS’s health tech offering will be an important next phase for patient care, but there are risks and challenges to be overcome. Addressing this head-on is welcome. The future of the NHS AI Lab will depend on political developments in the UK, but a continuation of the current policy direction promises real opportunities for both healthcare providers and technology developers.

By Isabel Teare, Senior Legal Adviser

Source: Mills & Reeve

The role of a Software Architect

Although there is no exact and shared definition of what the service of software architecture is, I like to compare it with the architecture of buildings, in the sense that an architect normally has a big-picture vision, defining the discipline and setting priorities and steps. In this article, we will look at the role of a software architect in software development projects.

The main role of a software architect

  • Being the “guardian of the vision”: the software architect must have and share a technical vision, a technical direction and a plan based on the requirements of the project.
  • Knowing the disciplines he or she will use to build the system, for example the development environment, estimation, DevOps practices and core methodologies (DDD, continuous integration, TDD and other good practices).
  • Being able to transmit his or her knowledge, both the vision and the disciplines, to the team members.

Importance of software documentation

I still think that documentation in software development is crucial. It really helps to share the vision and make it clear for everyone in a team why certain technical decisions were made.

Importance of having a software architect in a software team

Based on my experience, I think that software architecture decisions are critical in general: a wrong decision can generate a lot of problems in terms of money and time, while good software architecture decisions help a team build working software with scalability, performance and cost reduction in mind.

Again, a good software architect can solve problems that a company was not able to solve for several years, while a bad one, or a team without any software architect, can turn a two-week project into a one-year project. Let me insist: not all “software architects” are good. I would say there are very few good ones in the world, but software development teams are starting to understand the importance of having one, and this field is developing quite fast. In my experience, the large majority of critical software decisions are taken correctly when a software architect is involved. Moreover, he or she can improve the efficiency of a team by setting the right goals and principles.

The difference between a software architect and a software developer

I believe that a software architect should be a software developer, and a good one. A software architect should not take long pauses from writing code. Only by gaining solid experience, working on several projects and achieving notable results can a developer evolve into an architect.

Let’s look at it in more detail:

Software architect – it is not just a title, it is a way of thinking. The architect thinks mathematically or, in other words, rationally. An architect takes into account a set of options and objectives and comes up with the optimal decision that makes a difference. He or she is responsible not only for the current sprint but for the whole project, thinking about maintenance as well.

Software developer – normally a developer makes a specific decision at a specific time within his or her responsibilities.

To sum it up, let me highlight my thought that “software architect” is a mental state, a way of thinking, not a diploma. A software architect thinks about the system as a whole and analyzes it even at a macro level.



The AAC – nurturing innovation through NHS/industry collaboration

Collaboration between the NHS and industry was identified in the 2017 Life sciences: industrial strategy report as a powerful tool to support innovation. The role of the NHS both as a monopoly purchaser and a testbed for novel products means that collaboration with innovators can be a powerful driver of progress.


Now the UK Government has announced an expanded role for NHS-industry collaboration in promoting innovations that will bring major benefits rapidly to NHS patients.

The Accelerated Access Review and its evolution into the Accelerated Access Collaborative

The Accelerated Access Review was set up in 2014 to see how access to innovative drugs, devices, diagnostics and digital products could be improved for NHS patients. A 2016 report on this initial project called for sustained focus and engagement to reap the full benefits of this collaborative approach. Government renewed its commitment to working collaboratively in its 2017 Life Sciences Sector Deal. This set out plans to build on the AAR with a new Accelerated Access Collaborative (AAC), and £86 million of public funds to support innovators and the NHS in bringing forward innovative technologies.

An expanded role for the AAC

The Accelerated Access Collaborative was established in 2018 to provide a streamlined process through clinical development and regulatory approval for selected medicines, devices, diagnostic tools and digital services. Twelve diverse products have already been identified for rapid uptake. These range from a novel treatment for relapsing-remitting multiple sclerosis to diagnostic tests to detect pre-eclampsia and a novel system for treating benign prostatic hyperplasia.

The programme is now being expanded to become the main point of entry to the NHS for health innovations. This will include:

  • identifying the best new innovations
  • providing a single point of support for innovators whether within or outside the NHS
  • signalling the needs of clinicians and patients to innovators
  • establishing a testing infrastructure to generate evidence of effectiveness
  • directing funding to areas of greatest impact
  • supporting rapid take-up of proven innovations.

The remit of the AAC will be expanded under the chairmanship of Professor Lord Darzi so that it becomes the umbrella body across the UK health innovation eco-system. The new body will provide more joined-up support for innovators and a strategic approach to nurturing innovation. Dr Sam Roberts will take on the leadership of this project, alongside her current role as Director of Innovation and Life Sciences at NHS England and NHS Improvement. Voices representing the industry on the AAC board will include the ABPI and the BioIndustry Association.

This new programme, with its focused support and access to clinical evaluation within the NHS, offers a real opportunity to pharmaceutical and health technology innovators to fast-track promising products and services.


By Isabel Teare, Senior Legal Adviser

Source: Mills & Reeve

Strengthening Operational Resilience

How a leading UK high street bank is learning from the manufacturing industry to prevent operational disruptions to critical services and better protect customers


 It was our pleasure to speak with the Strategic Delivery Lead, Service Delivery and Operational Resilience Centre of Excellence at one of the UK’s leading high street banks (henceforth known as BANK A).

Operational resilience refers to “the ability to prevent, respond to, recover and learn from operational disruptions to services, to survive and prosper, and not cause harm to customers and the wider market” (Bank of England definition). As a result of a 2018 discussion paper and a follow-up 2019 consultation paper, both produced jointly by the United Kingdom’s banking regulators – the Bank of England, the Financial Conduct Authority (FCA) and the Prudential Regulation Authority (PRA) – an industry-wide consortium is collaborating to address the ‘step change’ requested by the Bank of England. The consortium’s objective is to develop a regime of operational resilience disciplines that meets the regulators’ key concerns: strengthening protection for customers and the economy, and ensuring institutions can withstand external threats and disruptions. This also addresses the major points raised in the UK Parliamentary Treasury Select Committee report “IT Failures in the Financial Services Sector.”

The Committee recognized that customers are increasingly expected to use digital services, yet these services are being significantly disrupted by IT failures. This leaves customers unable to make payments or withdraw cash, and small businesses unable to conduct the basic services needed to run their businesses. According to the Committee, the current levels and frequency of harm to customers are unacceptable. As a result, firms need to identify important business services, stress test them, and set their own maximum tolerable levels and durations of disruption.
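
As a sketch of the impact-tolerance idea (the service names and thresholds below are invented for illustration, not BANK A data or regulatory figures), a firm can compare the observed duration of a disruption against the maximum tolerable outage it has set for each important business service:

```python
# Compare observed disruptions against self-set impact tolerances for
# important business services. All values are invented examples.

impact_tolerance_hours = {
    "payments": 2,        # max tolerable outage for making payments
    "cash_withdrawal": 4,
    "mobile_banking": 8,
}

observed_disruptions_hours = {
    "payments": 3,        # breached: 3h outage vs 2h tolerance
    "mobile_banking": 1,  # within tolerance
}

# A breach is any disruption that ran longer than the stated tolerance.
breaches = {
    service: hours
    for service, hours in observed_disruptions_hours.items()
    if hours > impact_tolerance_hours[service]
}
print("tolerance breaches:", breaches)
```

The same comparison, run against simulated rather than observed outages, is what the stress-testing exercises described later in the interview amount to.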

In researching and to address the ‘step change’ required, BANK A took inspiration from the manufacturing industry which uses a 3D modelling approach. Dassault Systèmes was invited to explore and develop a solution, bringing their digitalization, 3D modelling, and engineering heritage gained in manufacturing and regulated industries like aerospace. The following interview with BANK A provides an initial introduction to this project which could have a profound and positive impact on the financial services industry globally.

What are the key industry challenges?

The financial services industry has had solid and rigid procedures, processes and branch networks in place, built up over decades. Today’s digitally-savvy banking customers demand immediacy, ubiquity, and mobile connectivity for their financial needs. Banks can’t simply do what they did previously; otherwise, before long, they could end up in the industrial graveyard alongside companies like Nokia and Kodak.

The power of technology has enabled banks to address these changes, bringing opportunities such as mobile banking and chatbots. At the same time, change is one of the biggest causes of disruption in financial services, alongside cyber-attacks and the loss of critical services. This becomes more of a challenge given the manual processes and static process maps still used across the industry, which are expensive to maintain and can be inaccurate. There is often more than one “golden source” of data, meaning that data derives from a single place but can be extracted in different ways or duplicated across multiple systems. The problem is further compounded by the complexity of identifying the right data sources to power analysis and decision making.

How do we stay on top of operational resilience, security and sustainable revenue growth in an industry that’s been set up for stability and rigidity, and still keep up with the inevitable tempo of change? It’s like turning the Titanic with the speed of a speedboat. It’s a fundamental challenge. Technology also provides opportunities to deliver robust risk management. To address these challenges, we need to put in place stronger operational resilience capabilities, supported by a continuous review of processes and procedures. This way we’re able to deliver business service continuity more effectively and efficiently to customers and keep them safe.

What role does the operational resilience function play in product development to address disruption and how does it drive innovation at BANK A?

Traditionally the role of operational resilience has been seen as a risk function. At BANK A, while it is an essential element of risk management, we take a holistic view of the function. We see it as a transformational one that helps strengthen trust and keeps our customers, the economy, and the bank safe. It enables us to see the bigger picture. As a by-product, it helps us better run the business, and opens up the opportunity to drive and implement operational and customer-centric product innovations. This allows us to keep delivering products and services, irrespective of any potential disruption. We enable change at pace without disrupting our customers’ ability to access services and products, protecting them, the bank and the economy.

The operational resilience function is viewed as a vital one and the department is immersed into the product and service development process from beginning to end. Change is the biggest known risk of failure. By including the function in operational management, it ensures we have the relevant resource and attention to achieve operational resilience and address our fundamental goal of protecting customers and safeguarding the bank and the country’s economy. The service delivery and operational resilience department sets the policies that new products and services must meet. By having a seat at the table, we assure policy compliance. This is important because as our products and services evolve, the policies also must evolve to achieve the desired business outcomes and provide even better protection and services for our customers.

Do you have any programs/initiatives in place to address these issues?

We realized technology can simulate the impact of changes to policies, processes, and products before we implement them, so we can fundamentally understand organizational performance in real-time. By doing the right thing, we’re significantly strengthening product service and delivery, strengthening trust in the bank and the banking sector as well as reducing costs.

Our quest to find a more innovative approach led us to Dassault Systèmes which helps clients in other industries simulate, visualize, and optimize business operations using their 3DEXPERIENCE platform and digital twins. For example, we discussed their “Virtual Singapore” project, a 3D twin of the city, that allows stakeholders to experience and interact with the virtual city so they can understand first-hand how it works and evaluate how it can be improved. This example, and others provided by Dassault Systèmes, convinced us that their approach could be applied to any organization, including to a bank, using a virtual representation on the 3DEXPERIENCE platform. We had C-level backing within BANK A to help make this happen, an important support for such a wide-ranging and strategically innovative project.

Virtual simulation can provide the regulators with a holistic view of the financial services industry. It enables them to understand the big picture and how all institutions are operating, assisting them to work towards the key objectives set by the Treasury Select Committee. In parallel, 3D modelling enables institutions internally to better protect the customers and demonstrate we’re meeting our own—and the regulators’—objectives. Dassault Systèmes’ experience in manufacturing brings us this capability, enabling us to connect the dots, see the big picture, and visualize the problems we’re solving.

Another major benefit of our work with Dassault Systèmes is that we need to prove what we are doing is the best approach. We need to stress test to obtain better insights from impact analysis and learn from the potential impacts across the whole system. In other industries, 3D modelling impact analysis is used as the standard to guarantee digital continuity. For example, in the case of an aeroplane, when an engine fails, engineers are certain another engine can take over because they have stress tested with a digital twin. In financial services, the use of a digital twin based on live data enables us to conduct stress testing and run simulations more frequently without causing any impact. This gives us the confidence to test a live system. The same approach used in aerospace should apply to financial services, given the need to protect the person on the street and allow him/her to access banking services and go about his/her daily life.

Dassault Systèmes’ value engagement (VE) model is being implemented to define and scope the project to find a solution. Together, we identified a scope within one of BANK A’s key services to prove the concept. Using our agile methodology, we are partnering to assess the current situation ‘as is’, identify key challenges and inefficiencies, and to ascertain how technology can help to achieve the ultimate goal of strengthening operational resilience to better protect our customers.

The model helps us to gather dispersed information from across the organization to support new capabilities. Part of the VE model involves using real processes, procedures and data to assess whether the Dassault Systèmes approach could be instrumental in helping the regulators and the banks overcome operational resilience issues. It also helps identify technology used in other industries and how it could be adapted for financial services.

Due to the complexity of the business, it’s always a challenge to make one individual or organization accountable for an end-to-end service. This is difficult because you need access to the right data showing that the relevant actions are being taken inside the organization, let alone by external participants that play a key role in that product or service.

Using a solution on Dassault Systèmes’ 3DEXPERIENCE platform helps us see problems as they arise. These simulation capabilities help us assess new situations to see what we should be prepared for. For example, if a cyberattack affects a particular Windows build, what parts of our infrastructure are most vulnerable and what products and services would be affected?
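
A minimal sketch of that kind of impact query (the component and service names below are invented for illustration): model services and their infrastructure dependencies as a graph, then ask which customer-facing services can reach the compromised component through their dependency chains:

```python
# Invented example: trace which services depend, directly or
# indirectly, on a compromised infrastructure component.

depends_on = {
    "mobile_banking": ["auth_service", "core_ledger"],
    "payments": ["core_ledger", "windows_build_1809_host"],
    "auth_service": ["windows_build_1809_host"],
    "core_ledger": [],
}

def affected_services(compromised: str) -> set[str]:
    """Return every node whose dependency chain reaches `compromised`."""
    def reaches(node: str, seen: set[str]) -> bool:
        if node == compromised:
            return True
        if node in seen:           # guard against dependency cycles
            return False
        seen.add(node)
        return any(reaches(dep, seen) for dep in depends_on.get(node, []))
    return {s for s in depends_on if s != compromised and reaches(s, set())}

print(affected_services("windows_build_1809_host"))
```

A digital twin generalises this idea: instead of a hand-maintained dictionary, the dependency graph is kept current from live data, so the same query can be run the moment a new vulnerability is announced.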

We are looking to automate our processes so that we can deliver change faster and more effectively. We use agile development to help match the pace the business needs and we are also employing continuous deployment to ensure that employees and customers can access up-to-date capabilities as quickly as possible. Using this new platform will also help us introduce new decision making technologies like artificial intelligence. But we have to have the right data.

What benefits/positive aspects do you expect service continuity to bring?

The fundamental benefit of service continuity is that we are better placed to safeguard and protect our customers. It enables us to continue delivering indispensable services so customers can make essential purchases and live their daily lives. It enables us to mitigate the negative consequences of external threats or disruptions. And by adapting Dassault Systèmes’ 3D modelling experience from other industries, we’re able to better visualize and conduct stress testing in financial services before incidents arise. An additional benefit is that we can also reduce the risk of harm to an institution, preventing failure and an inability to meet shareholder obligations. As a consequence of keeping our customers safe, we’re more efficient and better positioned to make stronger business decisions to support sustainable growth. This then has a positive impact on the economy.

By Dassault Systèmes

Source: CIMdata blog