Solutions for Cyber-Physical Systems Ubiquity


Norbert Druml, Independent Researcher, Austria

A volume in the Advances in Systems Analysis, Software Engineering, and High Performance Computing (ASASEHPC) Book Series


Web site: http://www.igi-global.com

Copyright © 2018 by IGI Global. All rights reserved. No part of this publication may be reproduced, stored or distributed in any form or by any means, electronic or mechanical, including photocopying, without written permission from the publisher. Product or company names used in this set are for identification purposes only. Inclusion of the names of the products or companies does not indicate a claim of ownership by IGI Global of the trademark or registered trademark.

Library of Congress Cataloging-in-Publication Data

British Cataloguing in Publication Data

A Cataloguing in Publication record for this book is available from the British Library.

All work contributed to this book is new, previously-unpublished material. The views expressed in this book are those of the authors, but not necessarily of the publisher.

For electronic access to this publication, please contact: eresources@igi-global.com

Names: Druml, Norbert, 1980- editor.

Title: Solutions for cyber-physical systems ubiquity / Norbert Druml, Andreas Genser, Armin Krieg, Manuel Menghin, and Andrea

Subjects: LCSH: Cooperating objects (Computer systems) -- Handbooks, manuals, etc. | Internet of things -- Handbooks, manuals, etc. | Automatic control -- Handbooks, manuals, etc.

Classification: LCC TK5105.8857 H367 2018 | DDC 004.67/8 -- dc23. LC record available at https://lccn.loc.gov/2017012032

This book is published in the IGI Global book series Advances in Systems Analysis, Software Engineering, and High Performance Computing (ASASEHPC) (ISSN: 2327-3453; eISSN: 2327-3461).

Advances in Systems Analysis, Software Engineering, and High Performance Computing (ASASEHPC) Book Series

The theory and practice of computing applications and distributed systems has emerged as one of the key areas of research driving innovations in business, engineering, and science. The fields of software engineering, systems analysis, and high performance computing offer a wide range of applications and solutions in solving computational problems for any modern organization.

The Advances in Systems Analysis, Software Engineering, and High Performance Computing (ASASEHPC) Book Series brings together research in the areas of distributed computing, systems and software engineering, high performance computing, and service science. This collection of publications is useful for academics, researchers, and practitioners seeking the latest practices and knowledge in this field.

Mission

Vijayan Sugumaran, Oakland University, USA

ISSN: 2327-3453 | EISSN: 2327-3461

• Computer graphics

• Performance Modelling

• Enterprise information systems

• Computer System Analysis

• Virtual Data Systems

or visit: http://www.igi-global.com/publish/

The Advances in Systems Analysis, Software Engineering, and High Performance Computing (ASASEHPC) Book Series (ISSN 2327-3453)

is published by IGI Global, 701 E. Chocolate Avenue, Hershey, PA 17033-1240, USA, www.igi-global.com. This series is composed of titles available for purchase individually; each title is edited to be contextually exclusive from any other title within the series. For pricing and ordering information please visit http://www.igi-global.com/book-series/advances-systems-analysis-software-engineering/73689. Postmaster: Send all address changes to above address. Copyright © 2018 IGI Global. All rights, including translation in other languages reserved by the publisher.

No part of this series may be reproduced or used in any form or by any means – graphics, electronic, or mechanical, including photocopying, recording, taping, or information and retrieval systems – without written permission from the publisher, except for non-commercial, educational use, including classroom teaching purposes. The views expressed in this series are those of the authors, but not necessarily of IGI Global.


Large-Scale Fuzzy Interconnected Control Systems Design and Analysis

Zhixiong Zhong (Xiamen University of Technology, China) and Chih-Min Lin (Yuan Ze University, Taiwan)
Information Science Reference • copyright 2017 • 223pp • H/C (ISBN: 9781522523857) • US $175.00

Microcontroller System Design Using PIC18F Processors

Nicolas K. Haddad (University of Balamand, Lebanon)

Information Science Reference • copyright 2017 • 215pp • H/C (ISBN: 9781683180005) • US $195.00

Probabilistic Nodes Combination (PNC) for Object Modeling and Contour Reconstruction

Dariusz Jacek Jakóbczak (Technical University of Koszalin, Poland)

Information Science Reference • copyright 2017 • 312pp • H/C (ISBN: 9781522525318) • US $175.00

Model-Based Design for Effective Control System Development

Wei Wu (Independent Researcher, USA)

Information Science Reference • copyright 2017 • 299pp • H/C (ISBN: 9781522523031) • US $185.00

Comparative Approaches to Using R and Python for Statistical Data Analysis

Rui Sarmento (University of Porto, Portugal) and Vera Costa (University of Porto, Portugal)

Information Science Reference • copyright 2017 • 197pp • H/C (ISBN: 9781683180166) • US $180.00

Developing Service-Oriented Applications Using the Windows Communication Foundation (WCF) Framework

Chirag Patel (Charotar University of Science and Technology, India)

Information Science Reference • copyright 2017 • 487pp • H/C (ISBN: 9781522519973) • US $200.00

Resource Management and Efficiency in Cloud Computing Environments

Ashok Kumar Turuk (National Institute of Technology Rourkela, India), Bibhudatta Sahoo (National Institute of Technology Rourkela, India), and Sourav Kanti Addya (National Institute of Technology Rourkela, India)
Information Science Reference • copyright 2017 • 352pp • H/C (ISBN: 9781522517214) • US $205.00

Handbook of Research on End-to-End Cloud Computing Architecture Design

Jianwen “Wendy” Chen (IBM, Australia), Yan Zhang (Western Sydney University, Australia), and Ron Gottschalk (IBM, Australia)

Information Science Reference • copyright 2017 • 507pp • H/C (ISBN: 9781522507598) • US $325.00

701 East Chocolate Avenue, Hershey, PA 17033, USA • Tel: 717-533-8845 x100 • Fax: 717-533-8661 • E-Mail: cust@igi-global.com • www.igi-global.com




Table of Contents

Preface ... xvi

Acknowledgment ... xxiv

Section 1: CPS Applications and Trends

Chapter 1

Localization and Context Determination for Cyber-Physical Systems Based on 3D Imaging ... 1

Hannes Plank, Infineon Technologies Austria AG, Austria

Josef Steinbaeck, Infineon Technologies Austria AG, Austria

Norbert Druml, Independent Researcher, Austria

Christian Steger, Graz University of Technology, Austria

Gerald Holweg, Infineon Technologies Austria AG, Austria

Chapter 2

Ubiquitous IoT in the Automotive Domain: Decentralized Adaptation ... 27

Laszlo Z. Varga, ELTE, Hungary

Chapter 3

Influence of Network Constraints on Modeling and Analysis of Cyber-Physical Systems ... 52

Stephanie Grubmüller, Virtual Vehicle Research Center, Austria

Georg Stettinger, Virtual Vehicle Research Center, Austria

Martin Benedikt, Virtual Vehicle Research Center, Austria

Daniel Watzenig, Virtual Vehicle Research Center, Austria & Graz University of Technology, Austria

Chapter 4

An Efficient Channel-Aware Aloha-Like OFDMA-Based Wireless Communication Protocol for IoT Communications in Wireless Sensor Networks ... 85

Elias Yaacoub, Arab Open University, Lebanon


Chapter 5

Effective Approaches to Training CPS Knowledge and Skills ... 111

Christian Kreiner, Graz University of Technology, Austria

Richard Messnarz, ISCN GmbH, Austria & University of Applied Sciences Joanneum, Austria

Section 2: Safety Concerns in CPS

Chapter 6

Runtime Safety Assurance for Adaptive Cyber-Physical Systems: ConSerts M and Ontology-Based Runtime Reconfiguration Applied to an Automotive Case Study ... 137

Tiago Amorim, Fraunhofer IESE, Germany

Denise Ratasich, Vienna University of Technology, Austria

Georg Macher, AVL List GmbH, Austria

Alejandra Ruiz, Tecnalia, Spain

Daniel Schneider, Fraunhofer IESE, Germany

Mario Driussi, Kompetenzzentrum - Das virtuelle Fahrzeug Forschungsgesellschaft mbH, Austria

Radu Grosu, Vienna University of Technology, Austria

Chapter 7

A Mixed-Criticality Integration in Cyber-Physical Systems: A Heterogeneous Time-Triggered Architecture on a Hybrid SoC Platform ... 169

Haris Isakovic, Vienna University of Technology, Austria

Radu Grosu, Vienna University of Technology, Austria

Chapter 8

Generation and Verification of a Safety-Aware Virtual Prototype in the Automotive Domain ... 195

Ralph Weissnegger, CISC Semiconductor GmbH, Austria

Martin Schachner, Graz University of Technology, Austria

Markus Pistauer, CISC Semiconductor GmbH, Austria

Christian Kreiner, Graz University of Technology, Austria

Kay Römer, Graz University of Technology, Austria

Christian Steger, Graz University of Technology, Austria

Chapter 9

A Self-Adaptive Software System for Increasing the Reliability and Security of Cyber-Physical Systems ... 223

Johannes Iber, Graz University of Technology, Austria

Tobias Rauter, Graz University of Technology, Austria

Christian Kreiner, Graz University of Technology, Austria


Section 3: Security Concerns in CPS

Chapter 10

Cyber-Physical System and Internet of Things Security: An Overview ... 248

Thomas Ulz, Graz University of Technology, Austria

Sarah Haas, Infineon Austria AG, Austria

Christian Steger, Graz University of Technology, Austria

Chapter 11

Where Do All My Keys Come From? ... 278

Andreas Daniel Sinnhofer, Graz University of Technology, Austria

Christian Steger, Graz University of Technology, Austria

Christian Kreiner, Graz University of Technology, Austria

Felix Jonathan Oppermann, NXP Semiconductors Austria GmbH, Austria

Klaus Potzmader, NXP Semiconductors, Austria

Clemens Orthacker, NXP Semiconductors, Austria

Chapter 12

Secure and Trusted Open CPS Platforms ... 301

George Kornaros, Technological Educational Institute of Crete, Greece

Ernest Wozniak, fortiss GmbH, Germany

Oliver Horst, fortiss GmbH, Germany

Nora Koch, fortiss GmbH, Germany

Christian Prehofer, fortiss GmbH, Germany

Alvise Rigo, Virtual Open Systems, France

Marcello Coppola, STMicroelectronics, France

Chapter 13

Side-Channel Attacks in the Internet of Things: Threats and Challenges ... 325

Andreas Zankl, Fraunhofer AISEC, Germany

Hermann Seuschek, Technical University of Munich, Germany

Gorka Irazoqui, Nagravision, Spain

Berk Gulmezoglu, Worcester Polytechnic Institute, USA

Chapter 14

Integrating Integrity Reporting Into Industrial Control Systems: A Reality Check ... 358

Tobias Rauter, Graz University of Technology, Austria

Johannes Iber, Graz University of Technology, Austria

Christian Kreiner, Graz University of Technology, Austria


Chapter 15

Integration of Security in the Development Lifecycle of Dependable Automotive CPS ... 383

Georg Macher, AVL List GmbH, Austria

Eric Armengaud, AVL List GmbH, Austria

Christian Kreiner, Graz University of Technology, Austria

Eugen Brenner, Graz University of Technology, Austria

Christoph Schmittner, Austrian Institute of Technology, Austria

Zhendong Ma, Austrian Institute of Technology, Austria

Helmut Martin, Virtual Vehicle Research Center, Austria

Martin Krammer, Virtual Vehicle Research Center, Austria

Compilation of References ... 424

About the Contributors ... 469

Index ... 480




Detailed Table of Contents

Preface ... xvi

Acknowledgment ... xxiv

Section 1: CPS Applications and Trends

Chapter 1

Localization and Context Determination for Cyber-Physical Systems Based on 3D Imaging ... 1

Hannes Plank, Infineon Technologies Austria AG, Austria

Josef Steinbaeck, Infineon Technologies Austria AG, Austria

Norbert Druml, Independent Researcher, Austria

Christian Steger, Graz University of Technology, Austria

Gerald Holweg, Infineon Technologies Austria AG, Austria

In recent years, consumer electronics became increasingly location- and context-aware. Novel applications, such as augmented and virtual reality, have high demands in precision, latency and update rate in their tracking solutions. 3D imaging systems have seen a rapid development in the past years. By enabling a manifold of systems to become location- and context-aware, 3D imaging has the potential to become a part of everyone's daily life. In this chapter, we discuss 3D imaging technologies and their applications in localization, tracking and 3D context determination. Current technologies and key concepts are depicted and open issues are investigated. The novel concept of location-aware optical communication based on Time-of-Flight depth sensors is introduced. This communication method might close the gap between high performance tracking and localization. The chapter finally provides an outlook on future concepts and work-in-progress technologies, which might introduce a new set of paradigms for location-aware cyber-physical systems in the Internet of Things.

Chapter 2

Ubiquitous IoT in the Automotive Domain: Decentralized Adaptation ... 27

Laszlo Z Varga, ELTE, Hungary

Ubiquitous IoT systems open new ground in the automotive domain. With the advent of autonomous vehicles, there will be several actors that adapt to changes in traffic, and decentralized adaptation will be a new type of issue that needs to be studied. This chapter investigates the effects of adaptive route planning when real-time online traffic information is exploited. Simulation results show that if the agents selfishly optimize their actions, then in some situations the ubiquitous IoT system may fluctuate and the


Chapter 3

Influence of Network Constraints on Modeling and Analysis of Cyber-Physical Systems ... 52

Stephanie Grubmüller, Virtual Vehicle Research Center, Austria

Georg Stettinger, Virtual Vehicle Research Center, Austria

Martin Benedikt, Virtual Vehicle Research Center, Austria

Daniel Watzenig, Virtual Vehicle Research Center, Austria & Graz University of Technology, Austria

A cyber-physical system (CPS) is a composition of an embedded computer, a network and a physical process. Usually, the plant, which represents the physical part, is controlled by an embedded system, which consists of computation, communication and control elements, via the global network. This contribution focuses on networked control systems (NCSs), which represent a specific class of CPS. As the problems of CPSs and NCSs are quite similar, the goal is to transfer well-developed techniques of NCSs to CPSs for analysis purposes. NCSs deal with the analysis of the interaction between the physical system and the cyber system. A main challenge of a control engineer is the development of stable and robust controllers for a NCS. The same goal is present in the design of CPS. To ensure this goal, the analysis of such a feedback system has to be performed, which is not straightforward and is limited by the used modeling approach. This work compares different state-of-the-art modeling approaches for NCSs and stability analysis methods therefor.

Chapter 4

An Efficient Channel-Aware Aloha-Like OFDMA-Based Wireless Communication Protocol for IoT Communications in Wireless Sensor Networks ... 85

Elias Yaacoub, Arab Open University, Lebanon

Wireless sensor networks consisting of several sensors deployed in a given area, under an internet of things (IoT) paradigm, are considered. Sensor nodes may or may not be close enough to communicate with each other in order to perform collaborative transmissions. A communication protocol based on random access and orthogonal frequency division multiple access (OFDMA) is proposed in order to allow the sensors to operate autonomously by transmitting their measured data to a central processing system, where it is processed and analyzed. Whenever it has data to transmit, each sensor independently accesses a time-frequency slot in a probabilistic manner to avoid collisions. A controlling entity, e.g., a central base station (BS) covering a certain sensor deployment area, receives the sensor transmissions and provides synchronization information by periodically transmitting a pilot signal over the available OFDMA subcarriers. Sensors use this signal for channel quality estimation. Results show that this approach performs well in terms of transmission data rates and collision probability.


Chapter 5

Effective Approaches to Training CPS Knowledge and Skills ... 111

Christian Kreiner, Graz University of Technology, Austria

Richard Messnarz, ISCN GmbH, Austria & University of Applied Sciences Joanneum, Austria

Training of skills for engineering CPS systems requires to convey deeper understanding of complex and multidisciplinary processes and products. Companies are facing complex challenges and encounter demands for specialized skills and interdisciplinary collaboration to achieve dependable mass products for end customers. Furthermore, industry demands flexibility, effectiveness, and efficiency in qualifying trainees. Usually they ask for: delivery yesterday, no time committed to training, and perfectly qualified trainees as outcome. The modular matrix course architecture is described for those in need to devise a complex training for industry. It is complemented by best practice recommendations for course development, delivery, and certification. The training and certification toolbox of the AQUA Knowledge Alliance for Automotive has fully implemented the described approach, and serves as a case study.

Section 2: Safety Concerns in CPS

Chapter 6

Runtime Safety Assurance for Adaptive Cyber-Physical Systems: ConSerts M and Ontology-Based Runtime Reconfiguration Applied to an Automotive Case Study ... 137

Tiago Amorim, Fraunhofer IESE, Germany

Denise Ratasich, Vienna University of Technology, Austria

Georg Macher, AVL List GmbH, Austria

Alejandra Ruiz, Tecnalia, Spain

Daniel Schneider, Fraunhofer IESE, Germany

Mario Driussi, Kompetenzzentrum - Das virtuelle Fahrzeug Forschungsgesellschaft mbH, Austria

Radu Grosu, Vienna University of Technology, Austria

Cyber-Physical Systems (CPS) provide their functionality by the interaction of various subsystems. CPS usually operate in uncertain environments and are often safety-critical. The constituent systems are developed by different stakeholders, who, in most cases, cannot fully know the composing parts at development time. Furthermore, a CPS may reconfigure itself during runtime, for instance in order to adapt to current needs or to handle failures. The information needed for safety assurance is only available at composition or reconfiguration time. To tackle this assurance issue, the authors propose a set of contracts to describe components' safety attributes. The contracts are used to verify the safety robustness of the parts and build a safety case at runtime. The approach is applied to a use case in the automotive domain to illustrate the concepts. In particular, the authors demonstrate safety assurance at upgrade and reconfiguration on the example of ontology-based runtime reconfiguration (ORR). ORR substitutes a failed service by exploiting the implicit redundancy of a system.


Chapter 7

A Mixed-Criticality Integration in Cyber-Physical Systems: A Heterogeneous Time-Triggered Architecture on a Hybrid SoC Platform ... 169

Haris Isakovic, Vienna University of Technology, Austria

Radu Grosu, Vienna University of Technology, Austria

A Cyber-Physical System (CPS) describes a system or a system-of-systems closely and actively coupled with the environment. It comprises the digital intelligence system, a co-dependent physical system (i.e., electrical, mechanical) and the system environment. Since the beginning of modern computer systems, integration was an ever-present challenge, from the huge single-room computers to the IoT. Today, applications interleave and build larger systems with different system requirements and properties. Implementation of safety-critical applications together with non-critical applications within the same platform is almost inevitable in modern industrial systems. This article provides a retrospective overview of the major integration challenges and the current problems in mixed-criticality environments. Finally, it provides an insight into a hardware solution which creates a deterministic platform for mixed-criticality applications.

Chapter 8

Generation and Verification of a Safety-Aware Virtual Prototype in the Automotive Domain ... 195

Ralph Weissnegger, CISC Semiconductor GmbH, Austria

Martin Schachner, Graz University of Technology, Austria

Markus Pistauer, CISC Semiconductor GmbH, Austria

Christian Kreiner, Graz University of Technology, Austria

Kay Römer, Graz University of Technology, Austria

Christian Steger, Graz University of Technology, Austria

The electrification of today's vehicles and the high number of new assistance features imply more and more complex systems. New challenges are arising through heterogeneous and distributed systems which interact with each other and have an impact on the physical world, so-called cyber-physical systems. The sensing and controlling of these systems is the work of the highly distributed electronic control units, and it is no surprise that more than 100 of these microcontrollers are currently integrated in a modern (electric) car. Technological, organizational and design gaps in today's development flows are not covered by current methods and tools. Therefore, new approaches are essential to support the development process and to reduce costs and time-to-market, especially when systems are safety-critical and demand reliability. Through applying reliability analysis and simulation-based verification methods on the proposed model-based design flow, we are able to reduce the number of tools involved and achieve correctness, completeness and consistency of the entire system.

Chapter 9

A Self-Adaptive Software System for Increasing the Reliability and Security of Cyber-Physical Systems ... 223

Johannes Iber, Graz University of Technology, Austria

Tobias Rauter, Graz University of Technology, Austria

Christian Kreiner, Graz University of Technology, Austria

The advancement and interlinking of cyber-physical systems offer vast new opportunities for industry. The fundamental threat to this progress is the inherent increase of complexity through heterogeneous systems, software, and hardware that leads to fragility and unreliability. Systems cannot only become


Section 3: Security Concerns in CPS

Chapter 10

Cyber-Physical System and Internet of Things Security: An Overview ... 248

Thomas Ulz, Graz University of Technology, Austria

Sarah Haas, Infineon Austria AG, Austria

Christian Steger, Graz University of Technology, Austria

An increase of distributed denial-of-service (DDoS) attacks launched by botnets such as Mirai has raised public awareness regarding potential security weaknesses in the Internet of Things (IoT). Devices are an attractive target for attackers because of their large number and due to most devices being online 24/7. In addition, many traditional security mechanisms are not applicable for resource-constrained IoT devices. The importance of security for cyber-physical systems (CPS) is even higher, as most systems process confidential data or control a physical process that could be harmed by attackers. While industrial IoT is a hot topic in research, not much focus is put on ensuring information security. Therefore, this paper intends to give an overview of current research regarding the security of data in industrial CPS. In contrast to other surveys, this work will provide an overview of the big CPS security picture and not focus on special aspects.

Chapter 11

Where Do All My Keys Come From? ... 278

Andreas Daniel Sinnhofer, Graz University of Technology, Austria

Christian Steger, Graz University of Technology, Austria

Christian Kreiner, Graz University of Technology, Austria

Felix Jonathan Oppermann, NXP Semiconductors Austria GmbH, Austria

Klaus Potzmader, NXP Semiconductors, Austria

Clemens Orthacker, NXP Semiconductors, Austria

Nowadays, cyber-physical systems are omnipresent in our daily lives and are increasingly used to process confidential data. While the variety of portable devices we use excessively at home and at work is steadily increasing, their security vulnerabilities are often not noticed by the user. Therefore, portable devices such as wearables are becoming more and more interesting for adversaries. Thus, a robust and secure software design is required for the implementation of cryptographic communication protocols and encryption algorithms. While these topics are well discussed and subject to further research activities, the issue of provisioning the initial device setup is widely uncovered. However, the protection of the initial setup is as important as the protection of the confidential data during the time in use. In this work, the authors will present solutions for a secure initialization of security-critical integrated circuits (ICs).


Chapter 12

Secure and Trusted Open CPS Platforms ... 301

George Kornaros, Technological Educational Institute of Crete, Greece

Ernest Wozniak, fortiss GmbH, Germany

Oliver Horst, fortiss GmbH, Germany

Nora Koch, fortiss GmbH, Germany

Christian Prehofer, fortiss GmbH, Germany

Alvise Rigo, Virtual Open Systems, France

Marcello Coppola, STMicroelectronics, France

Cyber-physical systems (CPS) are devices with sensors and actuators which link the physical with the virtual world. There is a strong trend towards open systems, which can be extended during operation by instantly adding functionalities on demand. We discuss this trend in the context of automotive, medical and industrial automation systems. The goal of this chapter is to elaborate the research challenges of ensuring security in these new platforms for such open systems. A main problem is that such CPS apps shall be able to access and modify safety-critical device internals. Cyber-physical attacks can affect the integrity, availability and confidentiality in CPS. Examples range from deception-based attacks such as false-data injection, sensor and actuator attacks, replay attacks, and also denial-of-service attacks. Hence, new methods are required to develop an end-to-end solution for development and deployment of trusted apps. This chapter presents the architecture approach and its key components, and methods for open CPS apps, including toolchain and development support.

Chapter 13

Side-Channel Attacks in the Internet of Things: Threats and Challenges ... 325

Andreas Zankl, Fraunhofer AISEC, Germany

Hermann Seuschek, Technical University of Munich, Germany

Gorka Irazoqui, Nagravision, Spain

Berk Gulmezoglu, Worcester Polytechnic Institute, USA

The Internet of Things (IoT) rapidly closes the gap between the virtual and the physical world. As more and more information is processed through this expanding network, the security of IoT devices and backend services is increasingly important. Yet, side-channel attacks pose a significant threat to systems in practice, as the microarchitectures of processors, their power consumption, and electromagnetic emanation reveal sensitive information to adversaries. This chapter provides an extensive overview of previous attack literature. It illustrates that microarchitectural attacks can compromise the entire IoT ecosystem: from devices in the field to servers in the backend. A subsequent discussion illustrates that many of today's security mechanisms integrated in modern processors are in fact vulnerable to the previously outlined attacks. In conclusion to these observations, new countermeasures are needed that effectively defend against both microarchitectural and power/EM-based side-channel attacks.

Chapter 14

Integrating Integrity Reporting Into Industrial Control Systems: A Reality Check ... 358

Tobias Rauter, Graz University of Technology, Austria

Johannes Iber, Graz University of Technology, Austria

Christian Kreiner, Graz University of Technology, Austria


Chapter 15

Integration of Security in the Development Lifecycle of Dependable Automotive CPS ... 383

Georg Macher, AVL List GmbH, Austria

Eric Armengaud, AVL List GmbH, Austria

Christian Kreiner, Graz University of Technology, Austria

Eugen Brenner, Graz University of Technology, Austria

Christoph Schmittner, Austrian Institute of Technology, Austria

Zhendong Ma, Austrian Institute of Technology, Austria

Helmut Martin, Virtual Vehicle Research Center, Austria

Martin Krammer, Virtual Vehicle Research Center, Austria

The exciting new features, such as advanced driver assistance systems, fleet management systems, and autonomous driving, drive the need for built-in security solutions and architectural designs to mitigate emerging security threats. Thus, cybersecurity joins reliability and safety as a cornerstone for success in the automotive industry. As vehicle providers gear up for cybersecurity challenges, they can capitalize on experiences from many other domains, but nevertheless must face several unique challenges. Therefore, this article focuses on the enhancement of state-of-the-art development lifecycle for automotive cyber-physical systems toward the integration of security, safety and reliability engineering methods. Especially, four engineering approaches (HARA at concept level, FMEA and FTA at design level, and HSI at implementation level) are extended to integrate security considerations into the development lifecycle.

Compilation of References ... 424

About the Contributors ... 469

Index ... 480


GENERAL INTRO/MOTIVATION

Around 30 years ago, when the Internet was released to the general public, a minority of people was actively using it and almost none was affected in everyday life. In 2000, when mobile devices started to spread and social networks advanced towards a staggering success, the Internet became mainstream, attracting many more active users and adding an integral virtual component to our lives.

Today we are at the beginning of a new Internet era having the potential to disrupt many industries and changing the way we work, the way we live, the way we learn.

What is referred to as the Internet of Things (IoT) will impact energy infrastructure, healthcare, agriculture, transportation, the life in future cities, and our future education. Governments have established frameworks such as 'Smart Nation', 'Industrie 4.0' or 'The Smart Cities Initiative' to invest in education, research and infrastructure to fully leverage the potential of IoT.

IoT activities soar, easily observable by the increase of scientific publications in recent years and by long-term strategy adaptions towards IoT of major IT-industry players. McKinsey projects the IoT market to grow to $11 trillion by 2025.

Security aspects will play a vital role in the success of IoT. Self-driving vehicles that communicate with each other or energy grids that get interactive require high-security measures in order to guarantee data integrity, prevent fraud and maintain safety.

IoT paves the way for a startup eco-system that will bring many new players into this game. Naturally, these players focus on leveraging the potential of the technology to solve societal problems; however, until today they have largely neglected arising security and safety threats.

There is an enormous number of challenges from individual industries that have to be jointly solved in order to pave the way to make IoT become ubiquitous in today's societies. This publication will show applications in the context of IoT, arising challenges, and how security, safety and mixed-criticality solutions can play a major role in overcoming them.

CPS APPLICATIONS AND TRENDS

Cyberphysicalsystems(CPS)arethekeyinnovationdriverforimprovementsinalmostallmechatronicsystems.Theycanbefoundinautomotive,avionics,space,InternetofThings,industrialprocesscontrolandfactoryautomation(suchasIndustrie4.0andIndustrialIoT),healthcare,robotics,energyandsmartgrids,consumerappliances,civilinfrastructure,andmanyothers.Thus,CPSlinkthedigitalwiththe

Trang 18

In automotive and transportation we are currently experiencing a major paradigm shift: the introduction of assisted, and most importantly, automated driving vehicles. Fully automated driving will be the enabler for mastering one of the most crucial challenges of humanity: safe, clean, and efficient transportation. This revolution is happening thanks to the latest advances in CPS, namely so far unreached quality in environment perception, availability of novel number-crunching computing architectures, and advances in intelligent powertrains and actuators. This trend will enable completely new applications for our everyday lives, such as the introduction of transportation as a purchasable service. Apart from huge revenue potentials for the whole automotive value chain, there are two benefits which should be explicitly emphasized:

• According to the European Commission, the total economic damage caused by car accidents in the EU amounts to 229 bn €. If one takes into account that for 76% of all car accidents the human is solely to blame, human car driving errors account for an economic damage of 174 bn €. Therefore, advances in CPS that enable automated driving applications will vastly impact these reparable economic and irreparable human losses in a positive way.

• Traffic jams and congestions cause 40% of the road transportation based CO2 emissions. These congestions and increased emissions account for additional costs of 100 bn € within the EU. By enabling assisted and automated driving, thanks to the recent advances in CPS, these costs and emissions can be reduced through techniques such as efficient flow management, intelligent traffic balancing, improved path planning, etc.

In order to enable the future automated driving revolution in transportation, fail-operational and high-performing computing platforms are imperative requirements in order to cope with the sheer amounts of sensor data and complex automated driving algorithms. However, today, these in-vehicle computing solutions feature fail-safe or high-availability capabilities at most. Only by achieving fail-operational hardware/software architectures can automated driving functions be continuously provided (with a reduced set of capabilities) even if a fault was detected or environment perception was impaired. In addition, thanks to the forthcoming features of Car-to-Car and Car-to-Infrastructure communication, security becomes an ever-increasing concern. This security topic becomes particularly important when security breaches and security attacks affect the safety of the human driver. Therefore, safety and security co-engineering is one of the most important CPS research and development fields in order to enable the future revolution in transportation.

Apart from automotive, we are currently also experiencing a major revolution in industry and production systems – the so-called 4th industrial revolution. This revolution is mainly driven through recent advances in CPS (in particular nanoelectronics and ICT-based systems). The pivotal technologies of this


revolution are Industrie 4.0 and Industrial IoT (IIoT) that enable communicating, intelligent, and self-organizing production systems.

The most crucial challenge to be solved by Industrie 4.0 and IIoT is security. Recent security attacks demonstrated that even industrial control systems (ICSs) that were isolated and thus not connected to the outside world were targeted successfully. Stuxnet, one of the most famous malwares, was discovered in 2010. Its goal was to manipulate the physical production process by loading rogue code onto the controllers of Iranian uranium enrichment plants. Research concluded that most of the controllers and IT hardware currently used in critical production systems don't employ security mechanisms such as digital code signing. We learn from attacks such as Stuxnet and countless further examples that in Industrie 4.0 and IIoT security has not been addressed rigorously enough. Thus, protection against espionage and sabotage is of utmost importance in order to successfully establish smart factories and to continue the currently ongoing 4th industrial revolution.

SAFETY CONCERNS IN CPS

Computer-based systems are penetrating ever more into applications where, until recently, computing technologies played no significant role. Examples include embedded systems that are applied in critical application domains such as medical devices, aerospace systems, transportation vehicles and factory automation. A malfunction of systems that sense and control the physical world could lead to serious consequences such as loss of life, significant property or environmental damage, or large financial losses. Consequently, ensuring that such systems work as intended is of utmost importance. However, since embedded systems have to manage ever more demanding and complex tasks, guaranteeing their correct behavior under any circumstance is more challenging than ever. This challenge of creating dependable systems increases dramatically with the increasing complexity of computing systems. The trend to ever more complex processor-based systems can be demonstrated by the amount of software deployed. Popular examples to express the enormous complexity of embedded software can be found in the avionics and automotive industry. For example, the Boeing 787 flight software controlling the fly-by-wire system includes about 14 million lines of code (Lilienthal, 2011). Even a modern car runs about 100 million lines of code (Charette, 2009), and this number is going to grow rapidly with the advent of autonomous driving techniques.

This high level of complexity of the systems themselves and the related development processes has never been experienced before. At the same time, developers are facing ever-increasing demands on reducing the time-to-market, which results in short development and innovation cycles. Moreover, there is the necessity of mastering essential horizontal topics such as product and lifecycle process quality, reliability, safety, and security. To tackle this challenge, interdisciplinary expertise of engineers and managers is a key factor for success. To create this knowledge, appropriate training methods are essential. In this book a course architecture is described to train both employees in industry and students at universities.


Finding appropriate methods for CPS to recover from an unhealthy state is still a research challenge. In critical domains, nondeterministic reconfiguration and inadequate safety assurance might result in unacceptable risks. For such domains, functional safety standards are guiding the assurance activities that should be executed. These standards typically provide requirements for risk identification and classification and propose how to reduce the risk to an acceptable level. Nowadays, all applied methods to fulfill the safety standards require that the entire CPS and the environment in which the system will operate are well defined and understood at design time. This is the prerequisite for the subsequent design of the necessary analyses and measures that are introduced for creating a sound and comprehensive safety argument. This prerequisite, however, is clearly not fulfilled by adaptive CPS: the inherent uncertainty of dynamic compositions and reconfigurations means they can hardly be foreseen and fully analyzed before runtime. Established safety engineering approaches and standardization can therefore not be readily applied to future CPS, raising the requirement for new innovative approaches such as those presented in this book.

SECURITY CONCERNS IN CPS

As described in earlier clauses, cyber-physical systems organized in the Internet of Things are deployed to a wide variety of environments. A result of this deployment is the risk that these systems process sensitive data such as personal, operation-critical, or secret information. One main advantage of the IoT, its openness, is also one of its major weaknesses, potentially providing adversaries easy access to data they could not access before. To counteract a future catastrophe of data theft, system manipulation and cyber-terrorism, the complete chain of operation needs to be secured. This starts with the deployment of the devices and ends with the (long-term) operation in very diverse fields of operation. This situation necessitates that the industrial and research communities find solutions in the following coarse engineering disciplines.

When handling a secure communication or storing sensitive information, it always comes down to the secure storage of a secret. Be it the temporary key created during authentication, a master key for



When storage requirements have been met, it is of utmost importance that any processing of the key does not expose it to an adversary. As such an attacker could be a financially powerful organization, even the initialization of the device in a production environment needs to be secured. The same is true for any update process when the device has been deployed. Hence, secure key distribution, initialization and authentication need to be carefully considered during software and system development.

Authentication and data encryption, classical security engineering topics that provide protected data transfer using well-tested open standards, are of high importance. Still, they will only play a small role in this publication, which concentrates on the specific requirements of CPS.

Similar to the problem of proper key storage, secure data processing is a major problem in CPS made of off-the-shelf components. A system that has not been designed to be secure on a hardware level requires sophisticated countermeasures on a software and design-flow level to close potential side channels. Also, different data groups need to be separated, e.g., to avoid data leakage through system-internal side channels and to provide damage containment (ARM, 2009).

As discussed, CPS are deployed to an increasing number of fields and, as such, some common groups can be identified. Depending on the attack scenarios and the possible implications of a successful assault, the following are the most common targets:

Machine to Machine Communication in Consumer Products

Comfort functions like regular content updates and the integration of cloud functionality also provide adversaries easy access to consumer products. The issues range from rather simple theft of private data to manipulation of critical functions, if paths from such convenience functionality to safety-related ones exist. A widely known and already targeted environment is, for example, automotive systems. In this case there have already been successful attacks on critical parts like brakes through online multimedia technology in the automobile (Woo, Jo, & Lee, 2015).

Sensors and Actuators in Private Environments

The increasing spread of home automation systems raises privacy issues with the open availability of direct sensor data in private environments. As hacks of simple web cameras showed (Symantec, 2017), hardening of external interfaces is necessary to avoid the risk of espionage or criminal activity like blackmail. Especially critical is the linking of remote home locking mechanisms, which could provide criminal elements with the possibility to sense if homeowners are at home and to open the house for potential thieves.


Critical Industrial Infrastructures

Complex supply chains and the introduction of decentralized power generation using renewable power sources have made it necessary to provide online control and maintenance functionality. This opens critical infrastructures to possible cyber-terrorists, blackmailers and thieves of intellectual property.

Especially when customers are given the possibility to link their consumer products with unforeseen application environments, such as in bring-your-own-device workplaces, catastrophic security issues can evolve.

ORGANIZATION OF THE BOOK

The book is organized into 15 chapters. A brief description of each of the chapters follows:

Chapter 1 describes a CPS application in the domain of 3D imaging. Through 3D imaging the device is aware of its location and can use this information to track objects or even to improve optical communication approaches. This chapter describes the concept of Time-of-Flight sensing and how this can be used for optical communication.

Chapter 2 deals with the challenge of route planning optimization in the automotive sector. The chapter describes the approach to use decentralized ubiquitous IoT technologies to provide navigation systems with live information about the traffic status. This chapter describes the concept and also shows simulation results regarding the achieved traveling times.

Chapter 3 outlines the challenges of CPS regarding network performance. This performance evaluation is compared to network controlled systems, and the same algorithms are applied and compared in terms of stability, with recommendations what to use for CPS. This chapter provides a mathematical analysis and a description of different sampling models.

Chapter 4 focuses on the topic of wireless sensor networks. For these networks an orthogonal frequency division multiple access (OFDMA) communication protocol is analyzed and proposed. This chapter compares the protocol with others in terms of bit rates, collision probability, node outages, etc.

Chapter 5 focuses on the domain of CPS training concepts. It describes how courses should be structured, the certification approach, and how they can be delivered to the participants. This chapter describes the structure in detail and the already established project called AQUA. The courses mainly address CPS for automotive.

Chapter 6 describes ConSertsM, which is used to evaluate the safety of new system compositions at runtime. Each component provides a certificate/contract about its safety. This chapter provides a description of how this system works, including a case study.

Chapter 7 examines mixed-criticality systems and their challenges. The CPS themselves can perform different applications at once while having to fulfill safety requirements for some applications. This chapter describes multiple concepts and their integration and provides a use case analysis.

Chapter 8 presents a design and verification process for safety-critical systems. This process uses UML to represent the design flow. This flow covers the design, the architecture, and the virtual prototyping of the system. This chapter provides a background description and a description of the steps down to the virtual prototype.



Chapter 10 focuses on security for the Internet of Things (IoT). The chapter is split into giving an overview of possible attacks, research trends, and security mechanisms in software and hardware. Also, commonly used layers from the application to the physical layer are described, including references to get additional information about the single aspects.

Chapter 11 covers key management for IoT devices. This involves the complete lifecycle of the product and is divided into 6 phases, describing topics like personalization, authentication methods, ownership of keys, etc.

Chapter 12 offers a detailed overview of open CPS platforms, describing how the architecture can be adjusted to maintain security while allowing to update the system and add new software components during runtime. This chapter also proposes a model-based toolchain for development.

Chapter 13 covers side-channel attacks on devices in the IoT. The authors describe a trusted execution environment which includes side-channel-attack-resistant processor cores. Combined attacks that capture multiple sources to leak information are also highlighted in this chapter.

Chapter 14 covers detection systems based on integrity analysis. Additionally, integrity reporting technologies are evaluated in terms of operational overhead, configuration updates, development process, and secure provisioning.

Chapter 15 concludes the security part of this book in the automotive domain. This chapter describes the challenge to provide both safety and security in one system. The authors provide a solution in form of a cooperative safety and security framework. Additionally, several methods are described to perform a security and safety analysis in parallel. To conclude the chapter, a case study is presented.


Charette, R. N. (2009). This Car Runs on Code. Retrieved from http://spectrum.ieee.org/transportation/

Woo, S., Jo, H. J., & Lee, H. L. (2015). A Practical Wireless Attack on the Connected Car and Security Protocol for In-Vehicle CAN. IEEE Transactions on Intelligent Transportation Systems, 16(2).



We, the editors, would first like to thank all who have been involved in the process of bringing this book to life.

Our acknowledgments go specifically to the reviewers of the submitted chapters. They were a great help in steering book contents and significantly improving quality.

The different angles from which the authors looked at the topic are reflected greatly in this book. Many important aspects are covered, ranging from security to pedagogical aspects. Only these substantial scientific contributions made this book possible. Thank You.


CPS Applications and Trends


Localization and Context Determination for Cyber-Physical Systems Based on 3D Imaging


A crucial part of operation for certain devices is localization. While depth sensors provide geometric information about the immediate surrounding, determining location and orientation within a certain coordinate system is a challenge of its own. This chapter explores the opportunities depth sensing systems provide to localization. A focus is set on applications in fields such as consumer electronics, the Internet of Things and autonomous robots. Localization and tracking of electronic devices has a long history and has seen the use of a variety of different principles. This work focuses on the fields of high-performance localization based on optical and sensor fusion solutions. Localization principles in general can be categorized into passive, guided and cooperative solutions.

A passive system is able to determine its position in a local or global coordinate system without external help. An increasing number of applications also require information about the orientation of the device. A combination of position and rotation sensing is often referred to as pose determination. A pose has six degrees of freedom (DoF) and completely describes the static position and orientation of an entity in 3D space. Each axis in 3D space presents one degree of freedom for the position and one degree for the rotation around the axis. Passive 6-DoF localization is often used in computer vision based positioning systems, where features are matched with prerecorded databases. Early examples are cruise missiles using terrain contours for navigation.

A well-known example for guided localization is GPS, where devices use the aid of satellites to determine their position. Cooperative localization solutions use a communication channel, which is often used for active identification and position forwarding. Optical tracking, using image sensors and active markers, is an example for cooperative tracking. In such a system, an external base-station with an image sensor can sense a tracked device equipped with active LED markers, and has the ability to toggle the LEDs for identification. Another example are beacon-based systems, where active beacons forward information about their location.

When classifying the location-awareness of cyber-physical systems, it is important to distinguish between localization and tracking. While these terms are sometimes used ambiguously, tracking is commonly used in a relative context, where the registration of movements is important. Tracking the pose of a device does not always lead to the determination of a position within a meaningful coordinate system; however, relative position and rotation changes can be detected. For certain applications, this is sufficient and no broader localization is required. Examples for such systems are instruments measuring rotations, such as gyroscopes or compasses, some 3D scanning solutions or human interface devices.

Localization is often associated with position determination without focus on detecting relative pose changes. A combination of tracking and localization is used in a lot of location-aware systems and leads to localization at a high update rate. Tracking and localization are often performed by different sensors, because localization solutions often lack the desired accuracy to track relative pose changes. While localization can provide the position and orientation within a greater context, tracking sensors provide the accuracy and update-rate required for the application.


A great example of sensor fusion for localization and tracking is Wikitude (2016). This smartphone application provides augmented reality on smartphones. It annotates the video stream of the internal camera with meaningful information about the environment and displays it on the screen. GPS or Wi-Fi is used for positioning. The absolute orientation is determined by the gravity and compass sensors. The gyro sensors are used to track movements to enable a high update rate of the rotation. This enables information to be robustly attached to certain landmarks and directions in the smartphone video stream.

Emerging technologies such as virtual and augmented reality, autonomous driving and indoor localization demand precise pose determination at very high update rates. These demands are tackled in state-of-the-art systems with a sensor fusion approach, combining optical and inertial sensors. Optical sensors are used in the form of 2D and 3D cameras, LiDAR (light detection and ranging) and optical triangulation. Data fusion with inertial sensors can compensate the flaws of optical sensors. Most optical positioning sensors require a line-of-sight connection and sometimes feature a slow update rate and too much latency, as well as visual artifacts such as motion blur. Inertial sensors are commonly miniaturized using MEMS technology and feature high update rates. These sensors are already well-established in mobile devices. Inertial sensors, however, base their measurements on differential movements and rotations. In order to measure absolute movements and rotation, the measurements need to be integrated. This introduces an integration error which causes drift (Woodman, 2007). This drift can be compensated by fusing these measurements with non-drifting data, as optical systems can produce.
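The drift compensation described above can be made concrete with a complementary filter, one of the simplest optical/inertial fusion schemes: the integrated (and therefore drifting) gyroscope angle is blended with a noisy but drift-free absolute reference such as a compass or an optical measurement. The function below is an illustrative sketch, not code from this chapter; the filter constant alpha is a typical value, not a prescribed one.

```python
def fuse_orientation(prev_angle, gyro_rate, absolute_angle, dt, alpha=0.98):
    """Complementary filter: trust the integrated gyro in the short term
    (high-pass) and the drift-free absolute reference in the long term
    (low-pass)."""
    integrated = prev_angle + gyro_rate * dt  # integration step; bias accumulates here
    return alpha * integrated + (1.0 - alpha) * absolute_angle

# A gyro with a constant 0.01 rad/s bias while the device is actually at rest:
angle = 0.0
for _ in range(1000):  # 10 s at 100 Hz
    angle = fuse_orientation(angle, gyro_rate=0.01, absolute_angle=0.0, dt=0.01)
# Pure integration would have drifted to 0.1 rad; the fused estimate stays
# bounded near alpha * bias * dt / (1 - alpha), i.e. below 0.005 rad here.
```

The same structure generalizes to 3D orientation with quaternions and to position; production fusion stacks achieve the same effect at higher fidelity, e.g. with Kalman filtering.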

Using cameras for localization is traditionally accomplished by sophisticated computer vision methods, which are often solely based on 2D images. A common approach is simultaneous localization and mapping (SLAM), where a 3D representation of the environment is created during localization. Depth sensors are capable of improving the performance of such monocular SLAM systems (Steinbrücker, Sturm, & Cremers, 2011). A prominent example is Google Tango, which uses a number of sensors, including a Time-of-Flight depth camera, for SLAM-based localization on mobile devices.

Depth sensing systems are devices capable of directly gathering geometric information about the immediate environment. A measurement typically consists of a coordinate in three dimensions, usually relative to the depth sensor itself. In a depth camera, every pixel produces such a measurement. If the shutter of the camera triggers all pixels at the same time, every measurement is captured simultaneously. Since all available depth imaging systems are limited in range and show systematic measurement errors, some systems also attach a confidence measure to each measurement.

One of the reasons that the vast research effort on optical localization systems is based on 2D cameras is that the field of depth sensors is much younger than 2D sensors and they are not yet part of common vision systems. This might change in the smartphone and tablet domain, since as of 2016 the first smartphone featuring a Time-of-Flight depth camera appeared on the consumer market. Measuring depth based on the Time-of-Flight principle is the most miniaturized solution available and has the flexibility to be used in a manifold of applications.

DEPTH IMAGING FOR 6-DOF LOCALIZATION

This section introduces current depth imaging solutions with a focus on Time-of-Flight technology. Depth sensors alone are usually not directly associated with localization, although they provide 3D context awareness of the immediate surrounding. A sensor fusion approach incorporating depth sensors, however, offers advantages in SLAM-based systems (Steinbrücker, Sturm, & Cremers, 2011), and can improve tracking precision in general. In this work, we focus on depth sensing based on Time-of-Flight, since it is the most miniaturized solution, and is the only depth sensor which can be found in mass-produced smartphones. We also present a concept in this chapter where Time-of-Flight sensors are directly used for location-aware optical communication, closing the gap between depth sensing and localization.

THE PRINCIPLE OF TIME-OF-FLIGHT DEPTH SENSING

Time-of-Flight imaging is based on measuring how long light takes to travel from the sensing system to the scene and back to a photosensitive sensor. These systems can be distinguished by their operating principle, as illustrated in Figure 1. Direct Time-of-Flight systems emit short light pulses and measure the time between emission and reception. Each pulse corresponds with one measurement. A prominent example is LiDAR, where bundled light pulses are emitted by a laser and detected by photodiodes. The angle of these pulses is modulated to receive a spatial image of the surroundings. Indirect ToF imaging sensors rely on a continuous wave approach.

The operating principle of this indirect approach is illustrated in Figure 2 and works by emitting pulsed infrared light at a wide radiation angle. An image sensor receives the reflected light and is able to measure the phase-shift between incoming and outgoing signal, which is proportional to the distance. The emitted light pulses usually have a frequency of up to 100 MHz. They originate from an active illumination unit, which typically consists of an infrared LED or vertical cavity surface emitting laser (VCSEL). The pulses travel to the sensed object and back to the image sensor. The lens projects the light onto a ToF image sensor.

Each pixel of the ToF sensor contains a photonic mixer device, which produces a signal corresponding to the phase-shift between the outgoing and incoming light (Tobias, 2005). The incoming photons generate charge-hole pairs in the silicon, which are moved into either of two charge buckets A and B. This is decided by the logical state of the modulation signal. This signal is generated on the ToF sensor and is also used to emit light pulses. In order to control the PMD devices, the signal is directly supplied to each pixel. The charge difference between bucket A and B is the output of the ToF sensor and is related to the phase-shift of the incoming and outgoing signals. This phase output value, however, also depends on the amount of incoming light. Reflection properties of the sensed material as well as the

Figure 1 The different principles of Time-of-Flight based depth measuring


distance influence this value. In order to determine the exact phase-shift, the most common way is to conduct four or more different measurements. In these measurements, the phase-shift offset between the outgoing light and the modulation signal is changed, producing four or more different phase images.

A well-established procedure is to take four measurements I with four equidistant phase offsets (e.g. 0°, 90°, 180°, 270°) and calculate the phase-offset by the following relation:
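In the common continuous-wave formulation, this relation is the four-quadrant arctangent of the measurement differences. The exact sign convention and phase reference vary between sensor implementations, so the sketch below shows the textbook form together with the phase-to-distance conversion, not the authors' specific sensor code:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_distance(i0, i90, i180, i270, f_mod=80e6):
    """Recover the phase offset from four equidistant phase measurements and
    convert it to metric distance. One full 2*pi phase cycle corresponds to
    c / (2 * f_mod) of distance, because the light travels to the scene and
    back."""
    p = math.atan2(i90 - i270, i0 - i180)  # phase offset in (-pi, pi]
    p %= 2.0 * math.pi                     # wrap into [0, 2*pi)
    return p / (2.0 * math.pi) * C / (2.0 * f_mod)

# At 80 MHz the unambiguous range is c / (2 * 80e6), roughly 1.87 m; beyond
# that the phase wraps, which is what a second modulation frequency resolves.
```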

Since p is proportional to the phase-shift of a pulsed light signal, the phase values wrap and start again at zero if the distance is too long. Time-of-Flight sensors are capable of changing their modulation frequency in order to produce another, different set of four phase images to unwrap these ambiguous phases. The final measurement for a depth image then consists of eight phase images, which results in a larger range while maintaining precision. The trade-off is a decreased frame-rate and potential motion artifacts. The eight-phase mode is commonly used in SLAM applications, which prefer depth quality over frame-rate.

The drawback of Time-of-Flight sensors is the limited resolution of the depth image, since each pixel contains the circuit of the photonic mixer device. The limited photo-sensitive area on the silicon is partly compensated by using micro-lenses applied directly above the silicon to focus light onto the photo-sensitive areas of a pixel. Since continuous wave Time-of-Flight sensing is the most miniaturized depth sensing system available, increasing the sensor size to enhance resolution or photosensitivity is often not feasible. Another drawback is the sensitivity to incoming background light. Despite countermeasures such as background illumination suppression (BSI) and infrared filters, very bright light sources such as reflected sunlight can reduce the signal-to-noise ratio, but do not directly influence the measurement.

Figure 2 The working principle of continuous wave indirect Time-of-Flight 3D imaging


ALTERNATIVE DEPTH IMAGING APPROACHES

Each available depth imaging system has its own trade-offs and no system is yet predominant. Compared to Time-of-Flight, all image-sensor based approaches need a certain size to conduct measurements based on triangulation. Stereo depth sensors usually consist of two cameras which are mounted in parallel at a certain distance. This baseline between the cameras is necessary, because depth is measured by the pixel offset between features in the two images. A short baseline impairs depth quality and range. However, progress in research and increased sensor resolution have made rather small baselines feasible, like those found in the form of dual camera systems in smartphones. Stereo cameras usually produce sparse depth measurements, since it is not possible to measure feature disparities of homogenous surfaces. The depth image's x/y resolution, however, is superior to most other depth sensing principles, because high-resolution image sensors can be utilized for stereo. A sensor fusion approach combining ToF and stereo depth sensors (Zhu, Wang, Yang, Davis, & Pan, 2011) is a promising solution when high-quality depth images are required.
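The baseline/disparity relation described above is compact enough to state directly: for a rectified stereo pair with focal length f (in pixels) and baseline b, a feature shifted by d pixels between the two images lies at depth Z = f·b/d, which also shows why a short baseline hurts range. A small illustrative helper, not code from the chapter:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Rectified stereo triangulation: depth Z = f * b / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# f = 1000 px, 10 cm baseline, 25 px disparity -> 4.0 m. Halving the baseline
# halves every disparity, so distant objects fall below one pixel sooner.
```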

A variation of stereo sensors is structured light (Scharstein, 2003). Structured light utilizes a projector to project a pattern onto the scene. An infrared camera senses the projected pattern. Since the projected pattern is predefined, the operating principle is similar to a stereo camera setup, as both systems are based on extracting the feature disparity caused by the distance to the camera. The famous Microsoft Kinect sensor is based on this technology. Unlike stereo, dense depth images can be gathered, as the projected pattern is also reflected on homogenous surfaces. Due to the active illumination principle, structured light based systems can also operate in low light conditions. The drawback of this system is that the active illumination requires more energy, like ToF, and the system is also impaired by ambient light. Unlike ToF, such systems also require a certain distance between projector and camera.

Depth imaging based on LiDAR is most commonly used in automotive applications. Unlike image sensor based approaches, the angular variation of LiDAR measurements does not stem from optical projection onto an imaging sensor, but from mechanical scanning devices. This mitigates multi-path interference, and allows the usage of highly sensitive photo elements, such as single photon avalanche diodes. This increases the effective range at the cost of more complex sensing systems.

DIRECT LOCALIZATION WITH DEPTH SENSORS

Depth imaging sensors are currently not commonly used in the field of localization of electronic devices. One of the reasons is that most of the research on image based localization is focused on the far more established and ubiquitous color cameras. 3D data does not have as much variation and as many distinctive features as 2D images. Geometry is more generic and repetitive than reflected light. Due to the measurement principles, 3D imaging also suffers from limited resolution and range. Geometry, however, is more consistent, since it is not influenced by different illumination. With depth sensors, it is possible to build dense 3D models of the environment, as demonstrated by Newcombe et al. (2011) in their Kinect Fusion approach. Such 3D models can be used for re-localization and mapping the environment for systems such as autonomous robots. Another application for such high-quality depth maps is augmented reality. Due to the integration of depth data into a dense model, it is possible to embed virtual objects using the high-quality depth data.

Biswas and Veloso (2012) present an indoor navigation system based on depth information using a plane-filtered pointcloud and Monte Carlo localization. Since solely depth sensors are used, the robots


can also dynamically avoid obstacles which are not in the reference map. While these approaches might work in a smaller context with distinctive geometry, sensor fusion approaches are more favorable for localization. Later in this chapter, we explore the concept of using Time-of-Flight depth sensors in a novel way to establish location-aware optical communication links to embedded devices. If these devices forward positional information, highly accurate cooperative localization is possible by solely using miniaturized depth sensors.

DEPTH AND COLOR SENSOR FUSION

Depth and color cameras can be combined to create a unified RGB-D sensor, capable of capturing images containing both depth and color information. When a SLAM or visual localization system is provided with depth information, the system can not only build better 3D maps, but also benefit in robustness from the immediately available distance data (Steinbrücker et al., 2011). The process of 3D and 2D data fusion can also improve depth image resolution, if the required processing power is available. This is due to the principle that depth edges often correlate with edges in color and intensity images. Color edges without any depth discontinuity can simply be ignored when the lower resolution depth image does not show any variation. Since depth imaging systems usually lack high resolution, color edge information can be used to interpolate and upscale depth images in a meaningful way.

Research has produced a large number of image-guided depth upscaling algorithms (Chetverikov, Eichhardt, & Jankó, 2015). The input for most methods is a high resolution color image and sparse depth measurements which are mapped into the color image space. This sparse RGB-D image is produced by 3D measurements which are mapped to the 2D image. The requirement for such mapping is knowledge about the intrinsic camera parameters of both cameras, which include distortion coefficients, focal length and the pixel offsets of the optical centers. These parameters are usually derived by checkerboard calibration, which is also possible for Time-of-Flight depth cameras, since they are able to produce a grey-scale intensity image. The extrinsic parameters of a dual camera system are in this case a translation vector T and a rotation matrix R, which describe the transformation from the depth camera's coordinate system to the color camera. These parameters can also be gathered by capturing images of a checkerboard pattern and using a toolchain such as the Camera Calibration Toolbox (Bouguet, 2016). If intrinsic and extrinsic camera parameters are known, depth measurements with depth d_{i,j} and pixel position x_{i,j} can be mapped to the 3D space of the color camera, using the pseudoinverse P of the intrinsic depth camera matrix:

X_{i,j} = R d_{i,j} P x_{i,j} + T

The 3D measurements X_{i,j} can be projected to 2D image space coordinates v_{i,j} by multiplication with the intrinsic matrix I of the color camera:

v_{i,j} = I X_{i,j}
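Taken together, the two relations above amount to only a few lines of code. The helper below is a hypothetical sketch in plain Python (no particular library assumed); the final perspective division, implied by the homogeneous coordinates in the text, is made explicit:

```python
def mat_vec(M, v):
    """Multiply a 3x3 matrix (list of rows) with a 3-vector."""
    return [sum(M[r][c] * v[c] for c in range(3)) for r in range(3)]

def map_depth_to_color(pixel, depth, P, R, T, I_color):
    """Map a depth pixel with measured depth d into color image coordinates,
    following X = R d P x + T and v = I X.
    P: (pseudo)inverse of the depth camera's intrinsic matrix,
    R, T: extrinsics from the depth to the color camera,
    I_color: intrinsic matrix of the color camera."""
    x_h = [pixel[0], pixel[1], 1.0]            # homogeneous pixel coordinate
    ray = mat_vec(P, x_h)                      # viewing ray of the depth pixel
    X = [sum(R[r][c] * depth * ray[c] for c in range(3)) + T[r] for r in range(3)]
    v = mat_vec(I_color, X)                    # homogeneous image coordinate
    return (v[0] / v[2], v[1] / v[2])          # perspective division

# Sanity check: identical intrinsics and identity extrinsics map a pixel back
# onto itself, independent of the measured depth.
```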


on parallel processing systems such as GPUs. The joint bilateral filter, developed by Kopf et al. (Kopf, Cohen, & Lischinski, 2007), can be executed in parallel and works by weighting image filter kernels on color similarity. This works well with relatively small upscaling factors. If, however, more than just a few pixels between depth values need to be interpolated, depth values influence pixels across edges between the interpolated pixel and the original depth value. We therefore developed an edge-aware interpolation approach which is optimized for GPUs on mobile devices (Plank, Holweg, Herndl, & Druml, 2016).
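The color-similarity weighting that joint bilateral upscaling relies on can be sketched compactly. This is the generic weighting idea from the Kopf et al. line of work, not the authors' GPU-optimized circular-propagation method, and the function and parameter names are illustrative:

```python
import math

def joint_bilateral_depth(px, py, sparse_depth, color, sigma_s=3.0, sigma_c=0.1):
    """Interpolate depth at color pixel (px, py) from sparse depth samples,
    weighting each sample by spatial distance AND color similarity, so that
    samples lying across a color edge contribute almost nothing.
    sparse_depth: {(x, y): depth}; color: {(x, y): intensity in [0, 1]}."""
    num = den = 0.0
    for (sx, sy), d in sparse_depth.items():
        w_s = math.exp(-((sx - px) ** 2 + (sy - py) ** 2) / (2 * sigma_s ** 2))
        w_c = math.exp(-((color[(sx, sy)] - color[(px, py)]) ** 2) / (2 * sigma_c ** 2))
        num += w_s * w_c * d
        den += w_s * w_c
    return num / den if den > 0.0 else None
```

A depth sample one pixel away on a same-colored surface dominates one that is equally close but across a strong color edge, which is exactly the depth-edges-follow-color-edges assumption from the text.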

In this approach, sparse depth values are mapped to high-resolution color images Each depth value propagates its influence among circular paths to the surrounding pixels If edges are crossed, the influ-ence drastically decays If no edges are crossed, the influence is evenly distributed, suppressing sensor noise Our prototype implementation is capable of producing 13 frames per second, when executed on GPUs on mobile devices Beside the capability of SLAM systems to create better geometric models of the environment, the availability of high resolution 1-1 mapped depth and color images, enables better context awareness, since combined depth and color data benefits 2D object detection algorithms (Yu

& Zhao, 2012)

A rather unexplored issue is synchronization among camera systems. Most academic work assumes that color and depth sensors operate in synchronous mode, since the technical solution to a synchronous camera system seems trivial. In practical applications, however, synchronization is often not feasible, as it usually requires tight cooperation across multiple hardware vendors. The vast majority of image sensors are developed to be integrated into monocular systems, not offering any option for hardware synchronization. For RGB-D SLAM systems, however, synchronization can be avoided by using additional sensors. Such a system is usually in motion while the environment remains static. If color and depth cameras gather information at different times, this can be corrected by using motion tracking data from an inertial sensor. Inertial sensors can operate at a high update-rate. If all measurements, including the depth and color images, are accurately timestamped, the actual pose difference between depth image and color image can be calculated by transforming the depth data by the relative pose derived from the motion tracking system. By using physical models of the tracking system, the relative pose can be refined even further.
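A minimal sketch of this timestamp-based correction follows. It is translation-only with linear interpolation, both simplifications; a real system would also interpolate rotation (e.g., with SLERP) and use the inertial sensor's full pose history.

```python
import numpy as np

# Timestamped positions from the inertial tracking system (translation-only
# here for brevity; a full implementation would also interpolate rotation).
track_t = np.array([0.00, 0.01, 0.02, 0.03])           # seconds
track_p = np.array([[0.00, 0, 0], [0.01, 0, 0],
                    [0.02, 0, 0], [0.03, 0, 0]])       # metres

def pose_at(t):
    """Linearly interpolate the camera position at timestamp t."""
    return np.array([np.interp(t, track_t, track_p[:, i]) for i in range(3)])

def align_depth_to_color(points, t_depth, t_color):
    """Shift depth points by the camera motion between the two exposure times."""
    delta = pose_at(t_color) - pose_at(t_depth)
    # The camera moved by `delta`; static scene points therefore move by
    # -delta in the camera frame.
    return points - delta

pts = np.array([[0.0, 0.0, 1.0]])
aligned = align_depth_to_color(pts, t_depth=0.005, t_color=0.015)
```

This only holds for static scenes, as the text notes: the correction compensates camera motion, not object motion.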

Ovren et al. (2013) introduce this approach in their attempt to correct the effect of rolling shutter in image sensors. A rolling shutter is caused by the pixel readout process and means that not all color pixels are captured at the same time. Inertial based depth image alignment is only possible with static scenes, because it can only compensate the motion of the camera system.

LOCALIZATION WITH 2D IMAGING SENSORS

Feature Based Localization

Color information alone can be directly used for localization by matching visual input data against databases (Vedaldi & Soatto, 2008). This works by finding features, which are regions or points in input images that are significant and distinctive. Feature descriptors are an abstract representation of these regions, with the goal of being comparable to other features while being resilient against pose and intensity variations. These feature descriptors can be stored in a visual dictionary, associating the features with localization information. These databases are either generated systematically via 3D scanning systems or using topological information. If these features are recognized by a vision system, the pose of the system can be calculated by triangulation. This kind of visual localization can also be used to initialize a visual SLAM system (Lepetit, Arth, Pirchheim, Ventura, & Schmalstieg, 2015). SLAM can provide more accurate localization by creating a 3D data representation of the sensed environment.

Visible Light Localization

The motivation for visible light localization (VLC) comes from the increasingly ubiquitous LED lights. Due to their fast switching speed, it is possible to transmit information from repurposed existing illumination systems. With appropriate modulation, it is possible to use lights for communication without perceivable flickering. Visible light based localization has different applications with different demands in precision. Current products, such as Qualcomm Lumicast (Jovicic, 2016), repurpose image sensors for visible light localization. The distinction between vision based methods with active targets and visible light localization is that base-stations transmit information to help with localization. This can be either IDs or positional information of the base station.

Do and Yoo (2016) provide an extensive survey on methods and implementations of visible light based positioning systems. Such systems usually consist of base-stations, which emit encoded light to electronic devices equipped with photo-detection sensors. The base stations are either re-used light sources, such as traffic lights or lamps, or dedicated infrared beacons. The simplest solution is based on proximity detection. Such systems can be implemented with just a photodiode as receiver. It is, however, only possible to detect the base-station itself, so only a very coarse localization with an uncertainty of several meters (Campo-Jimenez, Martin Perandones, & Lopez-Hernandez, 2013) is possible.

Fingerprinting based methods can achieve more precise positioning, but require pre-recorded maps for localization. Time difference of arrival (TDOA) is another method, which works by receiving light signals from multiple base-stations. For 3D localization, at least three base-stations need to be in the direct field of view. The tracked device directly measures the distance to the base-stations in this method. This is accomplished by measuring the time it takes the light pulses of each base station to travel to the device. The position is then determined by trilateration. Such localization systems require a good synchronization of the base stations and the localized device. A single photodiode can be used to receive the signals, and a method to separate the received signals of the base-stations has to be employed. This is possible by using time or frequency division multiplexing (Liu et al., 2014). TDOA localization is not limited to the optical domain. It is also possible to use radio or sound waves; however, multi-path effects need to

be considered. Image sensors might be capable of measuring time-differences, and also offer light-source separation due to the projection via lenses. The position on the pixel array can be used to detect the angles between the base-stations. Due to the pixel readout process, image sensors usually cannot be sampled at the required rates. This either requires dedicated image sensors featuring customized electronic shutter units or direct pixel readout. Dynamic vision sensors (Censi, Strubel, Brandli, Delbruck, & Scaramuzza, 2013) are a promising development, and might be able to conduct such measurements. An active vision sensor does not produce images, but events which describe local pixel value changes. The difficult synchronization between device and base-stations might deem such image sensor based approaches unfeasible, because it is also possible to determine the position by triangulation alone. Time-of-Flight 3D imaging sensors might be capable of supporting TDOA based localization, but to our knowledge, this has not yet been investigated and is a subject for future research.
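Once the per-station distances are measured, trilateration reduces to solving a small linear system. The following sketch uses four made-up station positions and noiseless distances; subtracting the first sphere equation from the others linearizes the problem.

```python
import numpy as np

# Known base-station positions (metres) -- illustrative values, non-coplanar
# so that all three coordinates are observable.
stations = np.array([[0.0, 0.0, 3.0],
                     [5.0, 0.0, 3.0],
                     [0.0, 5.0, 3.0],
                     [5.0, 5.0, 2.5]])
device = np.array([2.0, 1.0, 0.0])                 # ground truth, unknown in practice
dists = np.linalg.norm(stations - device, axis=1)  # distances from pulse travel times

# |x - s_i|^2 = d_i^2; subtracting the first equation from the others cancels
# the quadratic term and yields a linear system A x = b.
A = 2.0 * (stations[1:] - stations[0])
b = (dists[0]**2 - dists[1:]**2
     + np.sum(stations[1:]**2, axis=1) - np.sum(stations[0]**2))
pos, *_ = np.linalg.lstsq(A, b, rcond=None)
```

With real, noisy measurements the least-squares form also absorbs redundant stations; with only three stations in 3D, two mirror solutions remain and extra information is needed to pick one.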

Another method to determine the distance between optical receivers and LEDs is to measure the received signal strength (RSS). Calculating the distance to light sources is based on modelling the light source and its propagation path to photo sensors. After calibration, distances can be associated with the output of optical receivers. When receiving the signal from multiple base-stations, the signal differences between the base-stations can be used to calculate the distance. This mitigates the influence of background illumination. RSS based positioning has the potential to be simple to implement and widely adopted, since no synchronization is necessary. The problem, however, is that the received signal strength also depends on the orientation of the photo detector relative to the LEDs. The light strength also depends on the orientation of the LEDs, since light is radiated inhomogeneously. It is, however, possible to combine RSS with angle based localization methods (Biagi, 2015).
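A toy version of this inversion, under strong assumptions: a calibrated constant C lumps transmit power, Lambertian emission order, and detector gain, and emitter and receiver are perfectly aligned. In reality both orientations modulate the signal, which is exactly the weakness discussed above.

```python
import math

# Simplified line-of-sight channel: received power falls with the square of
# the distance. C is a calibration constant (illustrative value) that lumps
# transmit power, Lambertian order and detector gain; emitter and receiver
# are assumed perfectly aligned.
C = 4.0e-6

def distance_from_rss(p_received):
    """Invert the inverse-square model: p = C / d^2  =>  d = sqrt(C / p)."""
    return math.sqrt(C / p_received)

d = distance_from_rss(1.0e-6)  # -> 2.0 metres under this model
```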

Localization based on triangulation requires systems which are able to measure the angle of arrival (AOA). This can be accomplished either by an array of photodiodes (Lee & Jung, 2012) or by using image sensors. AOA systems are in general more complicated, but do not require synchronization. With the help of a 3D camera, it is possible to localize the relative positions of the base stations and combine trilateration and triangulation in order to improve the localization accuracy. It is also possible to avoid determining the positions of the base-stations beforehand. If only relative movements need to be detected, the base stations can be supplied with their relative locations from the 3D camera system via optical communication. In the next section, we present our OptiSec3D approach, which enables these concepts by combining Time-of-Flight depth sensing with optical communication.

LOCATION-AWARE OPTICAL COMMUNICATION

BASED ON TIME-OF-FLIGHT SENSORS

The operating principle of Time-of-Flight depth sensors requires an image sensor capable of demodulating phase differences of pulsed light. In this section, we present our effort to create a novel location-aware optical communication system. We further detail how it might benefit future localization and tracking systems in the fields of IoT and cyber-physical systems.

The most significant feature of image sensor based optical communication is the directional awareness of the communication partner. If depth imaging sensors are used for optical communication, it is even possible to track communication partners in 3D. While a manifold of image sensor based optical communication systems exists, Time-of-Flight sensors have not yet been widely explored for optical communication.

A first attempt was made by Yuan et al. (2014), who establish a one-way communication link between a Time-of-Flight camera and an array of modulated LED lights. The sending device avoids the required synchronization by recovering the Time-of-Flight sensor's modulation signal with a photodiode. The emitting LEDs are supplied with a phase-modulated modulation signal and manipulate depth measurements of the ToF sensor. These depth measurements are analyzed and the received information is extracted. Since the Time-of-Flight sensor is operated in normal depth sensing mode at relatively low frame-rates in this approach, multiple LEDs are used to transmit information in parallel. Such multiple-input approaches are limited in range, since the pixel array cannot resolve individual LEDs when a certain distance is exceeded.

If, however, a system is capable of configuring and controlling a Time-of-Flight 3D imaging system through a direct connection, with a real-time system and low level configuration access, the optical communication parameters can be changed to increase readout speeds, and it is also possible to use just a single modulated LED to send information. In our OptiSec3D approach, we utilize Time-of-Flight sensors as optical transceivers, which are also capable of incorporating depth measurements into the communication protocol. Our approach has the potential to reach a throughput of several kilobits per second.

Operation Principle of the OptiSec3D Approach

Indirect Time-of-Flight sensing works by emitting pulsed infrared light. The active illumination unit of such a system can be used as transmitter, since it is designed to emit pulsed infrared light at different phase-shifts. The pixels of the receiving Time-of-Flight image sensor are capable of demodulating the phase-shifted signal. This allows optical communication based on pulsed light phase-shift keying (PLPSK). The major advantage of PLPSK is that multiple bits can be encoded in one image. In most image sensor based approaches, simple binary modulation schemes, such as on/off keying (Roberts, 2013) or pulse position keying, are used. They support the transmission of one bit per frame at best. PLPSK takes advantage of the photonic mixer device (PMD), located on each pixel of a Time-of-Flight sensor. The PMD decodes phase differences of incoming light pulses by sorting the incoming charges into charge storage buckets A and B on the pixels. After the readout process, the voltage difference of these buckets is proportional to the phase difference between the sensor's own modulation signal and the incoming light pulses. If these light pulses are phase-modulated, the output of the ToF sensor contains the decoded phase offset signal. Since at least four equidistant phases are used during communication, it is possible to decode phase differences by just using one frame, instead of the at least four frames used during depth measurement.
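A toy 4-PLPSK decoder illustrating the idea: the received phase is mapped to the nearest of four equidistant constellation points, each carrying two bits. The assumption that bucket-difference samples for two reference phases (0° and 90°) are available per symbol is a simplification of the actual PMD readout.

```python
import math

# 4-PLPSK: the transmitter shifts its light pulses by one of four equidistant
# phases, so each received symbol carries two bits.
CONSTELLATION = [0.0, 90.0, 180.0, 270.0]  # degrees

def decode_symbol(a_minus_b_0, a_minus_b_90):
    """Decode one symbol from two bucket-difference samples taken with the
    receiver's internal reference at 0 and 90 degrees (a simplification)."""
    phase = math.degrees(math.atan2(a_minus_b_90, a_minus_b_0)) % 360.0
    # Pick the constellation point with the smallest circular distance.
    symbol = min(range(4),
                 key=lambda s: min(abs(phase - CONSTELLATION[s]),
                                   360.0 - abs(phase - CONSTELLATION[s])))
    return symbol  # 0..3, i.e. two bits

sym = decode_symbol(0.02, -0.95)  # noisy measurement near 270 degrees
```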

Channel Characteristics

Due to the measurement principle of the PMD on each pixel, the sensor is sensitive to pulsed light within a certain frequency range. Non-pulsed background light does not have a direct influence on the measurement. The photons of continuous light arrive during both switching states of the PMD with near equal intensity. This fills both charge buckets equally, leading to increased noise but no measurement bias. Due to this principle, extensive image processing is unnecessary, as it can be assumed that all detected signals originate from potential communication partners.

Another side-effect of the PMD pixels is that sensors can choose modulation frequencies from a large spectrum to communicate. Stray light from different connections of different systems does not directly influence the measurement, as long as the light pulse frequencies are just several thousand Hertz apart. There exist no experimental evaluations so far, but a viable spectrum of 16 to 26 MHz can potentially yield 1000 different channels, with a rather large channel spacing of 10,000 Hz. Using fast-switching VCSELs instead of LEDs, the upper boundary of the spectrum can be extended to over 100 MHz.

Image sensor based optical communication usually suffers from the low frame-rates of the sensors. While experimental dedicated communication image sensors exist (Takai, 2013), the necessary digitization of complete frames has been holding back high speed image sensor based communication links so far.

Time-of-Flight sensors need to capture up to eight phase images in order to create one depth image. Therefore, the readout and analog-digital conversion circuitry is often optimized for fast readout and digitization. In order to demodulate a line-of-sight communication signal, just a very short exposure time is required. These characteristics enable high frame-rates during optical communication. We manage to operate our Time-of-Flight sensor at 500 frames per second with full sensor readout. This can, however, be massively increased if the sensor is only read out partially. By configuring the readout region to a minimum of 16x32 pixels, we are able to reach 7300 frames per second. This can lead to a transmission throughput of 14600 bits per second when using 4-PLPSK. This high framerate enables fast light source tracking, supported by fast adaptation of the readout window to accommodate moving targets.
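The quoted throughput follows directly from the frame-rate and the bits per symbol:

```python
import math

frames_per_second = 7300          # partial 16x32-pixel readout, as measured above
bits_per_frame = math.log2(4)     # 4-PLPSK encodes one of four phases per frame
throughput = frames_per_second * bits_per_frame  # bits per second
```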

Sensor Synchronization

A technical challenge throughout many communication systems is synchronization. In Time-of-Flight based communication, it is important that the frequency of the modulation signal of a ToF camera matches the frequency of its communication partner. If the modulation frequency of a ToF sensor differs from the frequency of the incoming light pulses, the measured phase values start to drift. Figure 3 shows sampled phase values when the sender continuously emits pulsed infrared light without phase-shift. If this signal is measured and digitized, its frequency is the absolute modulation frequency difference between both communication partners.

Figure 3 The observed phase with corresponding charge bucket contents, if sender and receiver are not synchronized

Trang 39

Synchronization can be accomplished by sampling the incoming light pulses and calculating the frequency difference to the communication partner. The modulation frequency of Time-of-Flight cameras is usually configurable to accommodate different use-cases. In the case of OptiSec3D, the frequency is adapted by configuring a phase locked loop (PLL) on the sensor. Synchronization is reached when the PLL is adjusted by the measured frequency difference.
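A sketch of this offset estimation, assuming noiseless phase samples and that the sign of the offset is resolved separately (only the absolute difference is observable from the drift alone, as noted above):

```python
import numpy as np

F_SAMPLE = 7300.0   # phase samples per second (frame-rate during communication)
f_offset = 50.0     # true modulation-frequency mismatch in Hz (unknown in practice)

# The measured phase ramps at the beat frequency between the two modulation
# signals; simulate a short burst of noiseless samples wrapped to [0, 2*pi).
t = np.arange(200) / F_SAMPLE
phase = (2 * np.pi * f_offset * t) % (2 * np.pi)

# Unwrap and fit a line: the slope gives the frequency difference, which can
# then be written into the sensor's PLL configuration to close the loop.
slope = np.polyfit(t, np.unwrap(phase), 1)[0]
estimated_offset = slope / (2 * np.pi)   # Hz
```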

Implementation of OptiSec3D

The core of each OptiSec3D communication partner is an Infineon Real3™ 3D imaging sensor, based on Time-of-Flight technology of pmd technologies. The Xilinx Zynq 7000 platform is used in our platform to operate the sensor with software executed on its integrated ARM processors, while the FPGA is used as glue logic and for imaging data transmission. The software uses an I2C bus to configure the ToF imaging sensor. This live-configuration of the sensor allows changing the internal workings of a normal depth sensor in such a way that it is possible to transmit and decode data. This works by limiting the number of digitized pixels per frame to a small area around modulated light sources. With this configuration, the sensor is able to sample the image of the communication partner at over 7300 frames per second. The received signal directly contains the transmitted decoded information, since ToF pixels are sensitive to phase-shift differences.
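A sketch of such a windowed readout configuration. The register addresses and the bus wrapper below are hypothetical, introduced only for illustration; the actual Real3 register map is vendor-specific and not reproduced here.

```python
# Hypothetical region-of-interest registers -- NOT the real Real3 layout.
REG_ROI_X, REG_ROI_Y, REG_ROI_W, REG_ROI_H = 0x40, 0x41, 0x42, 0x43

class I2CBus:
    """Stand-in for the platform's I2C driver; records writes for illustration."""
    def __init__(self):
        self.regs = {}
    def write(self, reg, value):
        self.regs[reg] = value

def set_readout_window(bus, x, y, w=16, h=32):
    """Restrict digitization to a small window around the tracked light source."""
    for reg, val in ((REG_ROI_X, x), (REG_ROI_Y, y), (REG_ROI_W, w), (REG_ROI_H, h)):
        bus.write(reg, val)

bus = I2CBus()
set_readout_window(bus, x=112, y=64)  # follow the partner as it moves
```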

Communication Modes

Time-of-Flight cameras could be used either to communicate with each other or with different electronic devices. While it is not difficult to implement a PLPSK transmitter, receiving PLPSK is not trivial, since a photonic mixer device is required. ToF cameras, however, are able to employ alternative modulation schemes, such as pulse position modulation. A receiver would just need to be able to detect the presence of light within certain time-slots. Time-shifts could be implemented either directly by accessing the illumination unit, by varying the frame-rate, or by varying the number of read-out pixels.

Optical line-of-sight communication is not limited to two communication partners. The aforementioned concepts can be employed to multiplex communication between multiple partners. For synchronization, the adapted frequency for each communication partner can be stored, and the PLL adapted each time the focus is switched to a different partner. If multiple devices want to communicate with a single node, all other devices could alternatively adjust their frequency to the node. If the focus of an application is on low latency rather than throughput, the sensor can be read out completely, instead of adapting the readout region to the location of each communication partner.

Localization Principles with Location-Aware Optical Communication

Location-awareness and re-usability are the main motivations for optical communication based on image sensors. While 2D image sensors are only capable of determining the incident angles of a line-of-sight connection, Time-of-Flight sensors can locate communication partners in 3D.

Since Time-of-Flight imaging systems can be used as optical transmitter and receiver, the simplest use case is optical communication between two Time-of-Flight cameras. When both communication partners measure their mutual distance, they can both locate each other in 3D. In the application of, e.g., encrypted device authentication, both partners can forward their mutual distance measurement and check for consistency. This effectively defeats relay attacks, where an attacker relays communication without alteration, using two relay boxes. In such a relay attack, the distance between sender and relay box A is not consistent with the distance between receiver and relay box B.
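A minimal consistency check implementing this idea; the 5 cm tolerance is an arbitrary choice and would have to reflect the sensor's ranging accuracy in practice.

```python
def mutual_distance_consistent(d_measured_by_a, d_measured_by_b, tol=0.05):
    """Both partners measure the line-of-sight distance with their own ToF
    sensor and exchange the result over the (encrypted) channel. A relay
    inserts extra path length on at least one side, so the values disagree."""
    return abs(d_measured_by_a - d_measured_by_b) <= tol

ok = mutual_distance_consistent(1.48, 1.52)        # direct link: values agree
relayed = mutual_distance_consistent(1.50, 7.90)   # relay boxes in between
```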

Communication between Time-of-Flight cameras could also be employed in localization solutions. In that case, a stationary camera with a light emitter serves as beacon. Electronic devices equipped with a Time-of-Flight camera can contact one beacon to determine their position and orientation. An example could be an autonomous robot that needs to navigate around a building.

The beacon's purpose is to forward its own position and the angles relative to the device. When simple LED beacons are used, at least three of them are necessary to determine the camera's position, even when the camera can determine the distance to each beacon. If the beacon, however, features an image sensor, as depicted in Figure 4, the incidence angle of the line-of-sight can be determined by the beacon and forwarded to the device. This enables 5-dof localization of embedded devices using a single beacon. The only unknown degree of freedom is the roll angle, since the beacon's optical signature is invariant to rotations around this axis. Sensor fusion with a gravity sensor or computer vision methods can effectively help reach full 6-dof localization. In the case of autonomous robots, this angle might already be locked due to the camera mounting method.
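Under an assumed angle convention (not specified in the text), the beacon-reported incidence angles and the ToF-measured distance fix the device position in the beacon's coordinate frame via a spherical-to-Cartesian conversion:

```python
import math

def device_position_in_beacon_frame(distance, azimuth_deg, elevation_deg):
    """The beacon's image sensor reports the incidence angles of the device's
    light; combined with the ToF distance this fixes the device's position in
    the beacon frame. Angle convention: z points along the optical axis,
    azimuth rotates about y, elevation about x (an assumption)."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = distance * math.cos(el) * math.sin(az)
    y = distance * math.sin(el)
    z = distance * math.cos(el) * math.cos(az)
    return x, y, z

pos = device_position_in_beacon_frame(2.0, 0.0, 0.0)  # straight ahead on the axis
```

The device's own orientation (up to roll) then follows from the angles under which it observes the beacon, which is what yields the 5-dof estimate described above.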

While beaconing with Time-of-Flight sensors enables a miniaturized localization system, simpler and cheaper beacons might be desirable. If three or more beacons are visible at the same time, a device using an imaging sensor can reach 6-dof localization by solving the perspective-n-point problem. When using depth sensors, the relative 3D position between camera and each beacon is directly available. This enhances the positioning robustness, since both trilateration and triangulation can be used to determine the position.

When no absolute localization within a predetermined coordinate system is desired, it is sufficient to use beacons which initially do not hold information about their location. A device can determine the relative positions of the beacons by using a 3D camera. With optical communication, it is possible to assign IDs to each of them and re-localize them in the local coordinate system. It is also possible to forward localization information to these beacons, so that simpler devices with 2D cameras can later use them for navigation. In the last section of this chapter, the idea of forwarding positional information to small IoT devices is discussed further.

Augmented Internet of Things

So far, the main focus of augmented reality (AR) has been fusing virtual and real worlds in order to obtain an augmented world. The main mechanisms so far are putting virtual objects into a live camera stream (e.g., Google Tango, Qualcomm Vuforia) or embedding them into the viewport of a user (e.g., Microsoft HoloLens). With our OptiSec3D approach, we propose a concept to use 3D location-aware optical communication in order to enable embedded devices to interact with the augmented world. This would enable electronic devices within the viewport of an AR system to transfer information and interaction possibilities to the augmented world. These devices could be any kind of system requiring human interaction, such as light switches, payment terminals, or heating and climate control. They could also serve as virtual signs, display promotions in supermarkets, or be used for pairing with local WIFI or Bluetooth connections. A Time-of-Flight depth sensor on such an AR system can receive optical signals from such devices and can use distance measurements to determine the 3D position. When the 3D position relative to the AR device is known, it can be embedded into the augmented reality. The concept of mapping a 3D posi-
