Deep learning uses neural network architectures that contain many processing layers, including convolutional layers. Deep learning models typically work on large sets of labeled data. Performing inference on these models is computationally intensive and consumes a significant amount of memory. Neural networks use memory to store input data, parameters (weights), and activations from each layer as the input propagates through the network. Deep neural networks trained in MATLAB use single-precision floating-point data types. Even networks that are small in size require a considerable amount of memory and hardware to perform these floating-point arithmetic operations. These restrictions can inhibit deployment of deep learning models to devices that have low computational power and smaller memory resources. By using a lower precision to store the weights and activations, you can reduce the memory requirements of the network.
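As a rough, illustrative calculation: a layer with 1,000,000 weights stored in single precision (4 bytes per value) occupies about 4 MB, while the same weights stored as 8-bit integers (1 byte per value) occupy about 1 MB, approximately a 4x reduction before accounting for activations and other overhead.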
This example shows how to generate C++ code for a convolutional neural network that uses the ARM Compute Library and performs inference computations in 8-bit integers.
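As a preview, the key code generation settings for this workflow might look like the following sketch. The property names follow the MATLAB Coder deep learning configuration for the ARM Compute Library; the version number and file name shown here are placeholders, not values taken from this example.

dlcfg = coder.DeepLearningConfig('arm-compute');      % target the ARM Compute Library
dlcfg.ArmComputeVersion = '20.02.1';                  % placeholder: use your installed version
dlcfg.DataType = 'int8';                              % perform inference in 8-bit integers
dlcfg.CalibrationResultFile = 'squeezenetCalib.mat';  % placeholder: calibration result from dlquantizer

cfg = coder.config('lib');                            % generate a static library
cfg.TargetLang = 'C++';
cfg.DeepLearningConfig = dlcfg;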
This example is not supported for MATLAB Online.
Third-Party Prerequisites
SqueezeNet has been trained on the ImageNet data set containing images of 1000 object categories. The network has learned rich feature representations for a wide range of images. The network takes an image as input and outputs a label for the object in the image together with the probabilities for each of the object categories.
This example consists of four steps:
To perform classification on a new set of images, you must fine-tune a pretrained SqueezeNet convolutional neural network by using transfer learning. In transfer learning, you take a pretrained network and use it as a starting point to learn a new task. Fine-tuning a network by using transfer learning is usually much faster and easier than training a network with randomly initialized weights from scratch. You can quickly transfer learned features to a new task by using a smaller number of training images.
Load the pretrained SqueezeNet network.
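A minimal way to do this, assuming the Deep Learning Toolbox Model for SqueezeNet Network support package is installed:

net = squeezenet;                      % pretrained SqueezeNet network
inputSize = net.Layers(1).InputSize;   % image input size, 227-by-227-by-3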
To retrain a pretrained network to classify new images, replace these two layers with new layers adapted to the new data set. You can do this manually or use the helper function findLayersToReplace to find these layers automatically.
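For SqueezeNet, the final learnable layer is a 1-by-1 convolutional layer. The following sketch shows how the replacement typically looks once the layers are found; the training datastore name, new layer names, and learning rate factors here are illustrative assumptions.

lgraph = layerGraph(net);
[learnableLayer,classLayer] = findLayersToReplace(lgraph);

numClasses = numel(categories(imdsTrain.Labels));    % assumes imdsTrain holds the new training images
newConvLayer = convolution2dLayer([1 1],numClasses, ...
    'WeightLearnRateFactor',10, ...                  % learn faster in the new final layer
    'BiasLearnRateFactor',10, ...
    'Name','new_conv');
lgraph = replaceLayer(lgraph,learnableLayer.Name,newConvLayer);

newClassLayer = classificationLayer('Name','new_classoutput');
lgraph = replaceLayer(lgraph,classLayer.Name,newClassLayer);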
This is the findLayersToReplace helper function:
augimdsValidation = augmentedImageDatastore(inputSize(1:2),imdsValidation);

Specify the training options. For transfer learning, keep the features from the early layers of the pretrained network (the transferred layer weights). To slow down learning in the transferred layers, set the initial learning rate to a small value. In the previous step, you increased the learning rate factors for the convolutional layer to speed up learning in the new final layers. This combination of learning rate settings results in fast learning only in the new layers and slower learning in the other layers. When performing transfer learning, you do not need to train for as many epochs. An epoch is a full training cycle on the entire training data set. Specify the mini-batch size to be 11 so that in each epoch you consider all of the data. During training, the software validates the network after every ValidationFrequency iterations.
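A sketch of training options consistent with this description; the learning rate, epoch count, validation frequency, and the augimdsTrain datastore name are illustrative assumptions rather than values taken from this example.

options = trainingOptions('sgdm', ...
    'MiniBatchSize',11, ...                 % consider all of the training data in each epoch
    'MaxEpochs',5, ...                      % transfer learning needs relatively few epochs (illustrative)
    'InitialLearnRate',1e-4, ...            % small rate to slow learning in the transferred layers (illustrative)
    'Shuffle','every-epoch', ...
    'ValidationData',augimdsValidation, ...
    'ValidationFrequency',3, ...            % validate every 3 iterations (illustrative)
    'Verbose',false, ...
    'Plots','training-progress');

netTransfer = trainNetwork(augimdsTrain,lgraph,options);   % assumes augimdsTrain holds the augmented training images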
cfg.DeepLearningConfig = dlcfg;

7. Use the MATLAB Support Package for Raspberry Pi function, raspi, to create a connection to the Raspberry Pi. In the following code, replace:
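A sketch of the connection call; raspiname, username, and password are placeholders for your board's host name (or IP address) and login credentials.

r = raspi('raspiname','username','password');   % replace with your Raspberry Pi's host name, user name, and password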