Proceedings of the 4th EAI International Conference on Smart Objects and Technologies for Social Good 2018
DOI: 10.1145/3284869.3284893
Enabling Privacy with Transfer Learning for Image Classification DNNs on Mobile Devices

Abstract: More people could benefit from Machine Learning (ML) as an increasingly important technology and service if state-of-the-art ML techniques with training capability were accessible on personal devices. To this end, we report details on how to deploy TensorFlow on off-the-shelf mobile and embedded devices and retrain current deep neural networks for image recognition on-device. Our motivation is to both grant privacy and allow users to efficiently personalize image classifiers for their own needs and purposes, a…
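The on-device retraining the abstract describes is commonly realized as transfer learning: freeze a pretrained backbone and train only a small classification head on the user's own images. A minimal Keras sketch of that pattern is below; the MobileNetV2 backbone, input size, and five personalized classes are illustrative assumptions, not details from the paper, and `weights=None` keeps the demo self-contained (in practice one would load `weights="imagenet"` as the pretrained base).

```python
import tensorflow as tf

# Illustrative backbone (assumption, not from the paper). weights=None avoids
# a download here; a real deployment would use the pretrained ImageNet weights.
base = tf.keras.applications.MobileNetV2(
    input_shape=(96, 96, 3), include_top=False, weights=None)
base.trainable = False  # freeze the feature extractor

# Only this small head is trained on-device, so user images never leave the
# device; the number of classes (5) is a hypothetical personalization target.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(5, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

Because the backbone is frozen, a training step updates only the dense layer's kernel and bias, which is what makes retraining feasible on resource-constrained mobile hardware.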

Cited by 5 publications (1 citation statement)
References 12 publications
“…Fine-grained assessment of environmental and personal factors combined has so far not been used together with machine learning to obtain models that can be personalized. Nonetheless, adaptation of machine learning models on mobile devices is a feasible task [19]. While deep learning is becoming increasingly popular in Affective Computing [24], we rely upon handcrafted features due to the sparse nature of our data set [23].…”
Section: Introduction
confidence: 99%