A collection of quickstart samples demonstrating the ML Kit APIs on Android and iOS.
Note: due to how this repo works, we no longer accept pull requests directly. Instead, we'll patch them internally and then sync them out.
A collection of sample apps to demonstrate how to use Google's ML Kit APIs on Android and iOS
License: Apache License 2.0
It looks like this repo is missing CI!
Because the Android side consists of several standalone samples, each of them is hard to maintain.
As a first step, how about a single sample app with several library modules? That would also make it easier to introduce CI.
If you like the idea, I can open a pull request; otherwise I'll save my time and do nothing.
Hi,
I've cloned the repository and run the mlkit-app. The barcode detection feature seems to work fine for simple Aztec codes such as short URLs, but it has trouble detecting more complex ones such as:
What are the limitations of the Aztec code scanner? Can it be configured or optimized to detect bigger codes?
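For what it's worth, the main configuration knob the barcode API exposes is the set of formats to detect; there is no documented setting for code size or density. A hedged sketch using the ML Kit Android API (import paths vary between barcode-scanning versions, and whether the format restriction actually helps dense Aztec codes is an assumption to test, not documented behavior):

```kotlin
import com.google.mlkit.vision.barcode.Barcode
import com.google.mlkit.vision.barcode.BarcodeScannerOptions
import com.google.mlkit.vision.barcode.BarcodeScanning

// Restrict the scanner to Aztec codes only. Formats are the only documented
// configuration option; denser codes mostly need a higher-resolution input
// image rather than a scanner setting.
val options = BarcodeScannerOptions.Builder()
    .setBarcodeFormats(Barcode.FORMAT_AZTEC)
    .build()
val scanner = BarcodeScanning.getClient(options)
```

Feeding the scanner a higher-resolution still (rather than a low-resolution preview frame) is the other lever worth trying for large codes.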
I want to use ML Kit for barcode scanning in my iOS app, where I'm currently using Carthage as a dependency manager. How can I accomplish this?
I was wondering if I could link MLKit.TextRecognition into my static framework. The idea is to use the text recognition task as a pre-processing step for the task exposed by my static framework. Since ML Kit is distributed via CocoaPods, I'm not sure this is possible.
I was able to do a similar integration by using FirebaseMLKit.TextRecognition from the Carthage distribution and then linking the related Firebase frameworks into my static framework. Obviously these dependencies must be embedded into the target app, but it was working pretty well.
Do you have any guidance on this kind of integration?
Hello,
I have been using ML Kit for iOS extensively, particularly the barcode detection functionality. I have run into an issue where barcode detection works for some devices in some orientations and does not work as expected in other cases. I did come across a couple of similar issues that were raised in the past and closed due to inactivity, but I wanted to raise a new issue and provide all of the details around my specific problem.
First off, I have recreated the issue in the quickstart-ios project but only by extending the CameraViewController to include live barcode detection. The still image barcode detection with the UIImagePickerController works for me in all scenarios as it seems to be able to properly rotate the image no matter what orientation it was taken in.
I am using the AVCaptureSession in the CameraViewController to detect barcodes using the live feed. For my specific use case, however, I am using the AVCaptureSession "photo" preset. Everything works as expected on the iPhone 7 Plus and iPhone 8 Plus (and, I believe, probably any other iPhone that has two cameras). When holding the phone in portrait mode these devices are able to detect barcodes very well. However, when using an iPhone SE, iPhone 7, or iPhone 8 (and potentially any other single-camera iPhone), barcodes are only detected in the landscape orientation.
It gets even more interesting: if I change the AVCaptureSession preset to "medium" or "high", barcode detection starts working the same (and as expected) on all devices I've tested. Only when using the "photo" preset do I see an inconsistency. The image orientation metadata seems to have no effect at all with the "photo" preset: I tried every possible value for the orientation metadata and the behavior did not change; it still worked as expected on the Plus devices and not as expected on the single-camera devices.
If helpful I can provide my CameraViewController.swift where I recreate the issue for live feed barcode detection.
Any help or guidance on this issue would be greatly appreciated!
Thank you.
Platform: Android
The error occurs when feeding an InputImage created from Camera2 via InputImage.fromMediaImage() into FaceDetector.
com.google.mlkit.common.MlKitException: Internal error has occurred when executing ML Kit tasks
    at com.google.mlkit.common.sdkinternal.ModelResource.zza(com.google.mlkit:common@@16.0.0:28)
    at com.google.mlkit.common.sdkinternal.zzn.call(Unknown Source:6)
    at com.google.mlkit.common.sdkinternal.zzm.run(com.google.mlkit:common@@16.0.0:5)
    at com.google.mlkit.common.sdkinternal.zzq.run(com.google.mlkit:common@@16.0.0:3)
    at android.os.Handler.handleCallback(Handler.java:883)
    at android.os.Handler.dispatchMessage(Handler.java:100)
    at com.google.android.gms.internal.mlkit_common.zzb.dispatchMessage(com.google.mlkit:common@@16.0.0:6)
    at android.os.Looper.loop(Looper.java:359)
    at android.os.HandlerThread.run(HandlerThread.java:67)
Caused by: java.lang.IllegalStateException: Image is already closed
    at android.media.Image.throwISEIfImageIsInvalid(Image.java:72)
    at android.media.ImageReader$SurfaceImage$SurfacePlane.getBuffer(ImageReader.java:965)
    at com.google.android.gms.vision.Frame$Builder.setPlanes(com.google.android.gms:play-services-vision-common@@19.1.0:14)
    at com.google.mlkit.vision.face.internal.zza.zza(com.google.android.gms:play-services-mlkit-face-detection@@16.0.0:56)
    at com.google.mlkit.vision.face.internal.zza.zza(com.google.android.gms:play-services-mlkit-face-detection@@16.0.0:85)
    at com.google.mlkit.vision.face.internal.zza.run(com.google.android.gms:play-services-mlkit-face-detection@@16.0.0:164)
    at com.google.mlkit.vision.common.internal.MobileVisionBase.zza(com.google.mlkit:vision-common@@16.0.0:23)
    at com.google.mlkit.vision.common.internal.zzb.call(Unknown Source:4)
    at com.google.mlkit.common.sdkinternal.ModelResource.zza(com.google.mlkit:common@@16.0.0:26)
    at com.google.mlkit.common.sdkinternal.zzn.call(Unknown Source:6)
    at com.google.mlkit.common.sdkinternal.zzm.run(com.google.mlkit:common@@16.0.0:5)
    at com.google.mlkit.common.sdkinternal.zzq.run(com.google.mlkit:common@@16.0.0:3)
    at android.os.Handler.handleCallback(Handler.java:883)
    at android.os.Handler.dispatchMessage(Handler.java:100)
    at com.google.android.gms.internal.mlkit_common.zzb.dispatchMessage(com.google.mlkit:common@@16.0.0:6)
    at android.os.Looper.loop(Looper.java:359)
    at android.os.HandlerThread.run(HandlerThread.java:67)
In contrast, if I convert the preview frame to a bitmap and then create the InputImage with fromBitmap(), everything works fine.
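One likely culprit, sketched below under the assumption that the Camera2 Image is being closed right after process() is called: InputImage.fromMediaImage() wraps the Image's planes without copying them, so the Image must stay open until the detector task completes. This would also explain why the fromBitmap() path works, since a bitmap is a full copy. (The analyzer wiring and the rotationDegrees parameter here are illustrative, not from the original report.)

```kotlin
import android.media.ImageReader
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.face.FaceDetector

// Sketch: keep the Camera2 Image open until ML Kit is done with it.
// InputImage.fromMediaImage() wraps the Image's planes without copying,
// so closing the Image before the task completes yields
// "IllegalStateException: Image is already closed".
fun analyze(reader: ImageReader, detector: FaceDetector, rotationDegrees: Int) {
    val mediaImage = reader.acquireLatestImage() ?: return
    val input = InputImage.fromMediaImage(mediaImage, rotationDegrees)
    detector.process(input)
        .addOnSuccessListener { faces -> /* draw / use faces */ }
        .addOnFailureListener { e -> e.printStackTrace() }
        .addOnCompleteListener {
            mediaImage.close() // safe: the detector has finished reading the planes
        }
}
```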
I'm using the ML Kit Face Detection library to detect face contours, including the lips. So I'm using this configuration:
val options = FirebaseVisionFaceDetectorOptions.Builder()
    .setMinFaceSize(0.15f)
    .setPerformanceMode(FirebaseVisionFaceDetectorOptions.ACCURATE) // I need more than 1 face
    .setContourMode(FirebaseVisionFaceDetectorOptions.ALL_CONTOURS) // I need contours of faces
    .enableTracking()
    .setLandmarkMode(FirebaseVisionFaceDetectorOptions.NO_LANDMARKS)
    .setClassificationMode(FirebaseVisionFaceDetectorOptions.NO_CLASSIFICATIONS)
    .build()
I'm able to get more than one face (3 in my example image) with boundingBox and trackingId, but contours are populated for only one face. All the others are just empty.
The documentation says of ALL_CONTOURS: "Detects FirebaseVisionFaceContour for a given face. Note that it would return contours for up to 5 faces."
So I expect ML Kit to return contours for up to 5 faces, not only 1 of them.
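If it helps triage: combining .enableTracking() with ALL_CONTOURS is a plausible culprit, since contour detection is tuned for the most prominent face and tracking has been reported to restrict contour output to one face. A sketch of the same options without tracking (an assumption worth testing, not a documented fix):

```kotlin
import com.google.firebase.ml.vision.face.FirebaseVisionFaceDetectorOptions

// Same configuration as above, minus tracking: contours for multiple
// faces have been reported to work only when tracking is disabled.
val options = FirebaseVisionFaceDetectorOptions.Builder()
    .setMinFaceSize(0.15f)
    .setPerformanceMode(FirebaseVisionFaceDetectorOptions.ACCURATE)
    .setContourMode(FirebaseVisionFaceDetectorOptions.ALL_CONTOURS)
    // .enableTracking() intentionally omitted
    .setLandmarkMode(FirebaseVisionFaceDetectorOptions.NO_LANDMARKS)
    .setClassificationMode(FirebaseVisionFaceDetectorOptions.NO_CLASSIFICATIONS)
    .build()
```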
Code:
let vision = Vision.vision()
let textRecognizer = vision.onDeviceTextRecognizer()
let imageMetadata = VisionImageMetadata()
imageMetadata.orientation = GlobalUtils.visionImageOrientation(from: textimageCorpped.imageOrientation)
let visionImage = VisionImage(image: textimageCorpped)
visionImage.metadata = imageMetadata
textRecognizer.process(visionImage) { text, error in
    guard error == nil, let text = text else {
        completionBlock(false, "", nil)
        return
    }
    let currentText = text.text
    completionBlock(true, currentText, text)
}
The code works fine with version 0.19.0 of FirebaseMLVisionTextModel (Firebase 6.24.0). After upgrading to version 0.20.0 of FirebaseMLVisionTextModel (Firebase 6.27.0), Xcode reports the error: -[GMVTextDetector initWith:size:angle:]: unrecognized selector sent to instance 0x60000066aa70
Code:
let vision = Vision.vision()
let options = VisionFaceDetectorOptions()
options.contourMode = .all
options.performanceMode = .accurate
let faceDetector = vision.faceDetector(options: options)
let image = VisionImage(image: $0)
faceDetector.process(image) { faces, error in
    guard error == nil, let faces = faces, !faces.isEmpty else { return }
    for face in faces {
        if face.contour(ofType: .face) != nil {
            // do something
        }
    }
}
Xcode warning:
+[FBMLx_CCTClearcutUploader crashIfNecessary] Multiple instances of CCTClearcutUploader were instantiated. Multiple uploaders function correctly but have an adverse affect on battery performance due to lock contention.
Hello,
In the Android material-showcase project, the dependencies are outdated.
Mobile Vision will be replaced with Firebase ML:
https://developers.google.com/vision/android/barcodes-overview
Also, the Camera1 API is deprecated.
Why not use Firebase ML Vision and Jetpack CameraX?
I've tried to run this sample without any modifications against the latest available Firebase version (6.5.0), and also tried a few older versions. When I select 'live' text recognition, the Xcode Instruments profiler shows a memory leak that causes the app to consume up to 10 GB of memory (depending on the device; I used an iPhone 6 on iOS 10.3.2) and then crash. It took about 13 minutes to reproduce the crash on the iPhone 6.
Dear Team,
The OCR reader (text recognizer) works fine for English but does not recognize Tamil; some other languages may not work either.
How can I solve this issue? Please provide a solution.
My code is:
private fun recognizeText(image: InputImage) {
    val recognizer = TextRecognition.getClient()
    val identification = LanguageIdentification.getClient()
    val result = recognizer.process(image)
        .addOnSuccessListener { visionText: Text ->
            processTextBlock(visionText)
            Log.e(TAG, "Success")
        }
        .addOnFailureListener { e ->
            Log.e(TAG, "${e.message}")
        }
}

private fun processTextBlock(result: Text) {
    val resultText = result.text
    var lineText: String = ""
    for (block in result.textBlocks) {
        val blockText = block.text
        Log.e("Block text", "$blockText")
        for (line in block.lines) {
            lineText = lineText + "\n" + line.text
            Log.e("Line text", "$lineText")
        }
    }
}
When the photo is captured in landscape mode, the recognizer provides no output (text recognition only works at the camera's zero angle; the other three angles, 90°, 180°, and 270°, do not work), but in portrait mode it works fine. How can I solve this issue? Please provide a solution as soon as possible.
My complete code is below:
private fun callGallery() {
    val intent = Intent()
    intent.type = "image/*"
    intent.action = Intent.ACTION_GET_CONTENT
    startActivityForResult(
        Intent.createChooser(intent, "Select gallery"),
        AppConstant.GALLERY_REQUEST_CODE
    )
}

private fun callCamera() {
    val intent = Intent(MediaStore.ACTION_IMAGE_CAPTURE)
    if (intent.resolveActivity(packageManager) != null) {
        val content = ContentValues()
        content.put(MediaStore.Images.Media.TITLE, "New pictures")
        content.put(MediaStore.Images.Media.DESCRIPTION, "From camera")
        uri = contentResolver.insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, content)
        intent.putExtra(MediaStore.EXTRA_OUTPUT, uri)
        startActivityForResult(intent, AppConstant.CAMERA_REQUEST_CODE)
    } else {
        UiUtils.showErrorLog(TAG, "Camera not available")
    }
}

override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
    super.onActivityResult(requestCode, resultCode, data)
    if (requestCode == AppConstant.GALLERY_REQUEST_CODE && resultCode == Activity.RESULT_OK) {
        uri = data!!.data
        imageFromPath(this, uri!!)
        ivActualIcon.setImageURI(uri)
    } else if (requestCode == AppConstant.CAMERA_REQUEST_CODE && resultCode == Activity.RESULT_OK) {
        imageFromPath(this, uri!!)
        ivActualIcon.setImageURI(uri)
    } else {
        UiUtils.showErrorLog(TAG, "result empty")
    }
}

private fun imageFromPath(context: Context, uri: Uri) {
    // [START image_from_path]
    val image: InputImage
    try {
        image = InputImage.fromFilePath(context, uri)
        recognizeText(image)
    } catch (e: IOException) {
        e.printStackTrace()
    }
}

private fun recognizeText(image: InputImage) {
    val recognizer = TextRecognition.getClient()
    val result = recognizer.process(image)
        .addOnSuccessListener { visionText: Text ->
            processTextBlock(visionText)
            UiUtils.showErrorLog(TAG, "Success")
        }
        .addOnFailureListener { e ->
            UiUtils.showErrorLog(TAG, "${e.message}")
        }
}

private fun processTextBlock(result: Text) {
    val resultText = result.text
    var lineText: String = ""
    for (block in result.textBlocks) {
        val blockText = block.text
        UiUtils.showErrorLog("Block text", "$blockText")
        for (line in block.lines) {
            lineText = lineText + "\n" + line.text
            UiUtils.showErrorLog("Line text", "$lineText")
        }
    }
}
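A possible workaround sketch for the landscape case, assuming the camera writes the rotation only into the EXIF orientation tag of the captured file: read the tag yourself, convert it to degrees, and hand the rotation to ML Kit explicitly via InputImage.fromBitmap(bitmap, rotationDegrees). The exifToDegrees helper and the wiring comments are illustrative, not part of the documented API.

```kotlin
// Map a raw EXIF orientation value to the clockwise rotation in degrees
// that ML Kit expects. 6, 3 and 8 are the standard EXIF values for
// ORIENTATION_ROTATE_90 / _180 / _270; everything else is treated as upright.
fun exifToDegrees(exifOrientation: Int): Int = when (exifOrientation) {
    6 -> 90
    3 -> 180
    8 -> 270
    else -> 0
}

// Hypothetical Android wiring (androidx.exifinterface):
// val exif = ExifInterface(contentResolver.openInputStream(uri)!!)
// val degrees = exifToDegrees(
//     exif.getAttributeInt(ExifInterface.TAG_ORIENTATION,
//                          ExifInterface.ORIENTATION_NORMAL))
// val image = InputImage.fromBitmap(bitmap, degrees)
// recognizeText(image)
```

InputImage.fromFilePath() is documented to apply EXIF rotation itself, so if this workaround changes the result, the capture path is probably not writing the orientation tag at all.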
This bug is meant to track all of the other bugs about making the ML Kit camera samples perform better, so that it can be discussed in one place.
[9-6810000024923] onDevice Barcode Scanning Library Crash
The app crashes after scanning certain QR codes, like https://appsys.com.pk/error-qr-code.jpg
It happens in all versions.
2019-01-17 13:22:28.746 12032-12032/? A/DEBUG: *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
2019-01-17 13:22:28.747 12032-12032/? A/DEBUG: Build fingerprint: 'Nokia/TA-1032_00WW/NE1:8.1.0/O11019/00WW_4_42D:user/release-keys'
2019-01-17 13:22:28.747 12032-12032/? A/DEBUG: Revision: '0'
2019-01-17 13:22:28.747 12032-12032/? A/DEBUG: ABI: 'arm64'
2019-01-17 13:22:28.747 12032-12032/? A/DEBUG: pid: 11888, tid: 11970, name: FirebaseMLHandl >>> com.google.firebase.samples.apps.mlkit <<<
2019-01-17 13:22:28.747 12032-12032/? A/DEBUG: signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x0
2019-01-17 13:22:28.747 12032-12032/? A/DEBUG: Cause: null pointer dereference
2019-01-17 13:22:28.747 12032-12032/? A/DEBUG: x0 0000007247abdc60 x1 0000000000000000 x2 0000000000000018 x3 0000000000000002
2019-01-17 13:22:28.747 12032-12032/? A/DEBUG: x4 0000000000000000 x5 00000072388e48fc x6 0000000000000054 x7 000000000000004c
2019-01-17 13:22:28.747 12032-12032/? A/DEBUG: x8 77e95c715d0a436a x9 77e95c715d0a436a x10 0000000000000001 x11 0000000000000000
2019-01-17 13:22:28.748 12032-12032/? A/DEBUG: x12 00000000000000bd x13 000000000000000d x14 0000007247a00000 x15 aaaaaaaaaaaaaaab
2019-01-17 13:22:28.748 12032-12032/? A/DEBUG: x16 00000072d588b2d0 x17 00000072d582bd68 x18 00000072d5896000 x19 0000007247abdc78
2019-01-17 13:22:28.748 12032-12032/? A/DEBUG: x20 0000007247abdc60 x21 00000072388e4928 x22 0000000000000000 x23 00000072388e49c0
2019-01-17 13:22:28.748 12032-12032/? A/DEBUG: x24 0000000000000001 x25 00000072388e48c8 x26 0000000000000005 x27 00000072388e4868
2019-01-17 13:22:28.748 12032-12032/? A/DEBUG: x28 00000072388e48b0 x29 00000072388e5478 x30 0000007237a6013c
2019-01-17 13:22:28.748 12032-12032/? A/DEBUG: sp 00000072388e4780 pc 0000007237a54e48 pstate 0000000000000000
2019-01-17 13:22:28.751 1823-11990/? E/libc: Access denied finding property "camera.dumpbuffer.enable"
2019-01-17 13:22:28.756 12032-12032/? A/DEBUG: backtrace:
2019-01-17 13:22:28.756 12032-12032/? A/DEBUG: #00 pc 0000000000006e48 /data/data/com.google.android.gms/app_vision/barcode/libs/arm64-v8a/libbarhopper.so
2019-01-17 13:22:28.756 12032-12032/? A/DEBUG: #01 pc 0000000000012138 /data/data/com.google.android.gms/app_vision/barcode/libs/arm64-v8a/libbarhopper.so
2019-01-17 13:22:28.756 12032-12032/? A/DEBUG: #02 pc 000000000002aa40 /data/data/com.google.android.gms/app_vision/barcode/libs/arm64-v8a/libbarhopper.so
2019-01-17 13:22:28.756 12032-12032/? A/DEBUG: #03 pc 000000000002b990 /data/data/com.google.android.gms/app_vision/barcode/libs/arm64-v8a/libbarhopper.so
2019-01-17 13:22:28.756 12032-12032/? A/DEBUG: #04 pc 0000000000012c1c /data/data/com.google.android.gms/app_vision/barcode/libs/arm64-v8a/libbarhopper.so
2019-01-17 13:22:28.756 12032-12032/? A/DEBUG: #05 pc 000000000000775c /data/data/com.google.android.gms/app_vision/barcode/libs/arm64-v8a/libbarhopper.so
2019-01-17 13:22:28.756 12032-12032/? A/DEBUG: #06 pc 0000000000003624 /data/data/com.google.android.gms/app_vision/barcode/libs/arm64-v8a/libbarhopper.so
2019-01-17 13:22:28.757 12032-12032/? A/DEBUG: #07 pc 0000000000004c8c /data/data/com.google.android.gms/app_vision/barcode/libs/arm64-v8a/libbarhopper.so
2019-01-17 13:22:28.757 12032-12032/? A/DEBUG: #08 pc 0000000000101b08 /data/user_de/0/com.google.android.gms/app_chimera/m/00000036/oat/arm64/DynamiteModulesA.odex (offset 0x6f000)
The source code is available at https://appsys.com.pk/crash-report.zip
and it also crashes in the ML Kit sample app.
Describe the problem:
I'm trying to add ML Kit Vision (for barcode scanning) to my application, which is built to support API 19 and up. I've tried the ML Kit demo app to test whether it can do what I need. On devices with API 21 and up everything works fine.
On the API 19 device, however, I can't scan any barcode other than a QR code. I'm not getting any errors, and my log just tells me D/MIDemoApp:CameraSource: Process an image
I've tried different types of barcodes with the same result, and I've tried using FirebaseVisionBarcodeDetectorOptions set to only read Code 128, but still nothing.
Hi all,
The native library libart.so used by ML is crashing. As far as I know, there are two ways to trigger the crash:
Both appear to be uses of NewStringUTF without validating the string first.
Please find the two crash stacks below. Both were generated with the Vision API, but it is exactly the same with the ML sample application, and I think this repository is more closely followed than the old Vision API one.
As a reminder, the link to the original issue from googlesamples/android-vision repo:
googlesamples/android-vision#221
The crash due to an invalid character in the QR code:
06-19 09:10:17.459 30848-31438/ A/art: art/runtime/java_vm_ext.cc:410] JNI DETECTED ERROR IN APPLICATION: input is not valid Modified UTF-8: illegal start byte 0xae
art/runtime/java_vm_ext.cc:410] string: 'Test Code with �'
art/runtime/java_vm_ext.cc:410] in call to NewStringUTF
06-19 09:10:17.460 30848-31438/ A/art: art/runtime/java_vm_ext.cc:410] from com.google.android.gms.vision.barcode.internal.NativeBarcode[] com.google.android.gms.vision.barcode.internal.NativeBarcodeDetector.recognizeNative(int, int, byte[], com.google.android.gms.vision.barcode.internal.NativeBarcodeDetector$NativeOptions)
art/runtime/java_vm_ext.cc:410] "Thread-1397" prio=5 tid=25 Runnable
art/runtime/java_vm_ext.cc:410] | group="main" sCount=0 dsCount=0 obj=0x12d0a0a0 self=0xab8afee0
art/runtime/java_vm_ext.cc:410] | sysTid=31438 nice=0 cgrp=default sched=0/0 handle=0xd6914930
art/runtime/java_vm_ext.cc:410] | state=R schedstat=( 6929244074 23452034 473 ) utm=687 stm=5 core=0 HZ=100
art/runtime/java_vm_ext.cc:410] | stack=0xd6812000-0xd6814000 stackSize=1038KB
art/runtime/java_vm_ext.cc:410] | held mutexes= "mutator lock"(shared held)
art/runtime/java_vm_ext.cc:410] native: #00 pc 0035cecd /system/lib/libart.so (_ZN3art15DumpNativeStackERNSt3__113basic_ostreamIcNS0_11char_traitsIcEEEEiPKcPNS_9ArtMethodEPv+116)
art/runtime/java_vm_ext.cc:410] native: #01 pc 0033d9a3 /system/lib/libart.so (_ZNK3art6Thread4DumpERNSt3__113basic_ostreamIcNS1_11char_traitsIcEEEE+138)
art/runtime/java_vm_ext.cc:410] native: #02 pc 0024f6a1 /system/lib/libart.so (_ZN3art9JavaVMExt8JniAbortEPKcS2_+760)
art/runtime/java_vm_ext.cc:410] native: #03 pc 0024fd3f /system/lib/libart.so (_ZN3art9JavaVMExt9JniAbortVEPKcS2_St9__va_list+54)
art/runtime/java_vm_ext.cc:410] native: #04 pc 000fc1b3 /system/lib/libart.so (_ZN3art11ScopedCheck6AbortFEPKcz+30)
art/runtime/java_vm_ext.cc:410] native: #05 pc 00101c71 /system/lib/libart.so (_ZN3art11ScopedCheck5CheckERNS_18ScopedObjectAccessEbPKcPNS_12JniValueTypeE.constprop.95+8096)
art/runtime/java_vm_ext.cc:410] native: #06 pc 001088e7 /system/lib/libart.so (_ZN3art8CheckJNI12NewStringUTFEP7_JNIEnvPKc+374)
The crash due to camera getParameters():
06-19 09:05:06.588 11732-11732/ A/art: art/runtime/java_vm_ext.cc:470] JNI DETECTED ERROR IN APPLICATION: input is not valid Modified UTF-8: illegal continuation byte 0x17
art/runtime/java_vm_ext.cc:470] string: '��Ŷ��Ŷ��Ŷ��Ŷ=��Ŷ��Ŷ;��Ŷ��Ŷ��Ŷ=;ae-bracket-hdr=Off;ae-bracket-hdr-values=Off,AE-Bracket;anti-shake=0;antibanding=50hz;antibanding-values=off,60hz,50hz,auto;auto-exposure-lock=false;auto-exposure-lock-supported=true;auto-whitebalance-lock=false;auto-whitebalance-lock-supported=true;brightness-step=1;camera-mode=0;contrast=5;contrast-step=1;denoise=denoise-on;denoise-values=denoise-off,denoise-on;dis=disable;dis-values=enable,disable;dual_mode=0;dualrecording-hint=0;effect=none;effect-values=none,mono,negative,solarize,sepia,posterize,whiteboard,blackboard,aqua,emboss,sketch,neon;exif_exptime=0;exif_iso=0;exposure-compensation=0;exposure-compensation-step=0.5;face-detection=off;face-detection-values=off,on;face-recognition=off;face-recognition-values=off,on;fast-fps-mode=0;firmware-mode=none;flash-mode=off;flash-mode-values=off,auto,on,torch;flip-mode-values=off,flip-v,flip-h,flip-vh;fnumber-value-denominator=100;fnumber-value-numerator=220;focal-length=4.80
art/runtime/java_vm_ext.cc:470] ;intelligent-mode=0;iso=auto;iso-values=auto,ISO_HJR,100,200,400,800,1600;jpeg-quality=96;jpeg-thumbnail-height=288;jpeg-thumbnail-quality=85;jpeg-thumbnail-size-values=512x288,480x288,256x154,432x288,512x384,352x288,320x240,176x144,0x0;jpeg-thumbnail-width=512;lensshade=enable;lensshade-values=enable,disable;llv_mode=0;luma-adaptation=3;max-brightness=6;max-contrast=10;max-exposure-compensation=4;max-num-detected-faces-hw=10;max-num-detected-faces-sw=10;max-num-focus-areas=1;max-num-metering-areas=10;max-saturation=10;max-sce-factor=100;max-sharpness=36;max-zoom=63;maxaperture-value-denominator=100;maxaperture-value-numerator=228;mce=enable;mce-values=enable,disable;metering=center;metering-areas=(0,0,0,0,0);metering-values=matrix,center,spot;min-brightness=0;min-contrast=0;min-exposure-compensation=-4;min-saturation=0;min-sce-factor=-100;min-sharpness=0;num-snaps-per-shutter=1;picture-format=jpeg;picture-format-values=jpeg,bayer-qcom-10gbrg,bayer-qcom-10grbg,bayer-qcom-10rgg
art/runtime/java_vm_ext.cc:470] input: '0x01 0x18 0xc5 0xb6 0x17 0x18 0xc5 0xb6 0xe1 <0x17> 0xc5 0xb6 0xe5 0x17 0xc5 0xb6 0x3d 0xe1 0x17 0xc5 0xb6 0xe5 0x17 0xc5 0xb6 0x3b 0x17 0x18 0xc5 0xb6 0xe1 0x17 0xc5 0xb6 0xe5 0x17 0xc5 0xb6 0x3d 0x3b 0x61 0x65 0x2d 0x62 0x72 0x61 0x63 0x6b 0x65 0x74 0x2d 0x68 0x64 0x72 0x3d 0x4f 0x66 0x66 0x3b 0x61 0x65 0x2d 0x62 0x72 0x61 0x63 0x6b 0x65 0x74 0x2d 0x68 0x64 0x72 0x2d 0x76 0x61 0x6c 0x75 0x65 0x73 0x3d 0x4f 0x66 0x66 0x2c 0x41 0x45 0x2d 0x42 0x72 0x61 0x63 0x6b 0x65 0x74 0x3b 0x61 0x6e 0x74 0x69 0x2d 0x73 0x68 0x61 0x6b 0x65 0x3d 0x30 0x3b 0x61 0x6e 0x74 0x69 0x62 0x61 0x6e 0x64 0x69 0x6e 0x67 0x3d 0x35 0x30 0x68 0x7a 0x3b 0x61 0x6e 0x74 0x69 0x62 0x61 0x6e 0x64 0x69 0x6e 0x67 0x2d 0x76 0x61 0x6c 0x75 0x65 0x73 0x3d 0x6f 0x66 0x66 0x2c 0x36 0x30 0x68 0x7a 0x2c 0x35 0x30 0x68 0x7a 0x2c 0x61 0x75 0x74 0x6f 0x3b 0x61 0x75 0x74 0x6f 0x2d 0x65 0x78 0x70 0x6f 0x73 0x75 0x72 0x65 0x2d 0x6c 0x6f 0x63 0x6b 0x3d 0x66 0x61 0x6c 0x73 0x65 0x3b 0x61 0x75 0x74 0x6f 0x2d 0x65 0x
art/runtime/java_vm_ext.cc:470] in call to NewStringUTF
art/runtime/java_vm_ext.cc:470] from java.lang.String android.hardware.Camera.native_getParameters()
This phone cannot download the model, no matter how many times the download is triggered.
I still can't download it after restoring the phone.
Other brands currently work perfectly.
com.google.android.gms.dynamite.DynamiteModule$LoadingException: No acceptable module found. Local version is 0 and remote version is 0.
at com.google.android.gms.dynamite.DynamiteModule.load(Unknown Source:8)
at com.google.android.gms.internal.vision.zzm.zzq(Unknown Source:28)
at com.google.android.gms.internal.vision.zzm.isOperational(Unknown Source:9)
at com.google.android.gms.vision.barcode.BarcodeDetector.isOperational(Unknown Source:23)
at com.google.android.gms.internal.firebase_ml.zzjl.zza(Unknown Source:23)
at com.google.android.gms.internal.firebase_ml.zzjl.zza(Unknown Source:42)
at com.google.android.gms.internal.firebase_ml.zzij.call(Unknown Source:3)
at com.google.android.gms.internal.firebase_ml.zzie.zza(Unknown Source:29)
at com.google.android.gms.internal.firebase_ml.zzif.run(Unknown Source:2)
at android.os.Handler.handleCallback(Handler.java:790)
at android.os.Handler.dispatchMessage(Handler.java:99)
at com.google.android.gms.internal.firebase_ml.zze.dispatchMessage(Unknown Source:6)
at android.os.Looper.loop(Looper.java:164)
at android.os.HandlerThread.run(HandlerThread.java:65)
2019-02-15 08:55:52.810 3546-3546/? E/BarcodeScanProc: Barcode detection failed com.google.firebase.ml.common.FirebaseMLException: Waiting for the barcode detection model to be downloaded. Please wait.
How do I use it on my ASUS phone?
App crashes on detection.
kotlin.KotlinNullPointerException
at com.google.firebase.samples.apps.mlkit.kotlin.VisionProcessorBase$detectInVisionImage$1.onSuccess(VisionProcessorBase.kt:98)
at com.google.android.gms.tasks.zzn.run(Unknown Source:4)
at android.os.Handler.handleCallback(Handler.java:789)
at android.os.Handler.dispatchMessage(Handler.java:98)
at android.os.Looper.loop(Looper.java:164)
at android.app.ActivityThread.main(ActivityThread.java:6541)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.Zygote$MethodAndArgsCaller.run(Zygote.java:240)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:767)
As the stack trace says,
https://github.com/firebase/quickstart-android/blob/master/mlkit/app/src/main/java/com/google/firebase/samples/apps/mlkit/kotlin/VisionProcessorBase.kt#L98
the metadata!! expression crashes.
These are the results of the detector. The string 1209-002558GK28E21060524860000321 is the correct one:
1209-002558GK28E21060524860000321
1209-002558GK28E21060524860000321
1209-, 9ZGK18E210523242150321
1209-002558GK28E21060524860000321
1209-002558GK28E21060524860000321
3410702717477
1209-002558GK28E21060524860000321
1209-002558GK28E21060524860000321
1209-002558GK28E21060524860000321
1209-, 9ZGK18E210523242150321
1209-002558GK28E21060524860000321
1209-002558GK28E21060524860000321
1209-002558GK28E21060524860000321
1209-002558GK28E21060524860000321
4309-002558GK28E21060524860000321
1209-002558GK28E21060524860000321
1209-002558GK28E21060524860000321
1209-002558GK28E21060524860000321
1209-002558GK28E21060524860000321
1209-002558GK28E21060524860000321
1209-002558GK28E21060524860000321
1209-, 9ZGK18E210523242150321
1209-002558GK28E21060524860000321
1209-002558GK28E21060524860000321
1209-002558GK28E21060524860000321
1209-002558GK28E21060524860000321
1209-002558GK28E21060524860000321
1209-, 9ZGK28EO1060524860000321
1209-002558GK28E21060524860000321
2118516866763
1209-002558GK28E21060524860000321
1209-002558GK28E21060524860000321
1209-002558DK28E2,*'P6$#5
1209-002558GK28E21060524860000321
1209-002558GK28E21060524860000321
4309-, 9ZGK28E21060522760000321
1209-002558GK28E21060524860000321
5411302311520
After installing pods and running the Swift translate example project, I almost immediately get the following crash, before any user interaction:
2020-06-06 13:58:06.597488+0200 TranslateExample[23887:3153991] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[__NSPlaceholderDictionary initWithObjects:forKeys:count:]: attempt to insert nil object from objects[3]'
Screenshot with stack trace and simulator:
Every time I take a photo or select an image from assets (the image is hardcoded) and try to find text in it with onDeviceTextRecognizer, the app crashes.
It works fine in debug mode but always crashes in release.
I followed from here:
https://github.com/firebase/quickstart-ios/blob/master/mlvision/MLVisionExampleObjc/ViewController.m
Any advice appreciated.
Podfile.lock:
- Firebase/Analytics (6.3.0):
  - Firebase/Core
- Firebase/Core (6.3.0):
  - Firebase/CoreOnly
  - FirebaseAnalytics (= 6.0.2)
- Firebase/CoreOnly (6.3.0):
  - FirebaseCore (= 6.0.3)
- Firebase/MLVision (6.3.0):
  - Firebase/CoreOnly
  - FirebaseMLVision (~> 0.16.0)
- Firebase/MLVisionAutoML (6.3.0):
  - Firebase/CoreOnly
  - FirebaseMLVisionAutoML (~> 0.16.0)
- Firebase/MLVisionTextModel (6.3.0):
  - Firebase/CoreOnly
  - FirebaseMLVisionTextModel (~> 0.16.0)
Device
# Date: 2019-07-02T04:34:00Z
# OS Version: 12.1.4 (16D57)
# Device: iPhone X
# RAM Free: 1.8%
# Disk Free: 47.5%
crash_info_entry_1
abort() called
crash_info_entry_2
MiSuburbano(27929,0x106ddab80) malloc: *** error for object 0x10b347840: pointer being freed was not allocated
Stack trace
#0. Crashed: com.apple.main-thread
0 libsystem_kernel.dylib 0x213f7f104 __pthread_kill + 8
1 libsystem_pthread.dylib 0x213fff948 pthread_kill$VARIANT$armv81 + 296
2 libsystem_c.dylib 0x213ed6d78 abort + 140
3 libsystem_malloc.dylib 0x213fd3768 _malloc_put + 570
4 libsystem_malloc.dylib 0x213fd3924 malloc_report + 64
5 libsystem_malloc.dylib 0x213fc62d4 free + 376
6 libc++.1.dylib 0x213583d28 std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >::__grow_by_and_replace(unsigned long, unsigned long, unsigned long, unsigned long, unsigned long, unsigned long, char const*) + 248
7 libc++.1.dylib 0x213583208 std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >::assign(char const*, unsigned long) + 108
8 libc++.1.dylib 0x213583188 std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >::operator=(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&) + 56
9 MiSuburbano 0x101d506e4 GMVx_farmhash::Fingerprint128(char const*, unsigned long) + 4760932
10 MiSuburbano 0x102031698 GMVx_farmhash::Fingerprint128(char const*, unsigned long) + 7779608
11 MiSuburbano 0x101940d8c GMVx_farmhash::Fingerprint128(char const*, unsigned long) + 502796
12 MiSuburbano 0x101940b00 GMVx_farmhash::Fingerprint128(char const*, unsigned long) + 502144
13 MiSuburbano 0x1018c8180 +[GMVDetector detectorForClassNamed:options:] + 8192
14 MiSuburbano 0x1018c8044 +[GMVDetector detectorOfType:options:] + 7876
15 MiSuburbano 0x101423298 -[FIRVisionTextRecognizer initWithApp:logger:options:type:detector:cloudVisionService:] + 8847848
16 MiSuburbano 0x101422ca8 +[FIRVisionTextRecognizer textRecognizerForApp:logger:options:type:detector:cloudVisionService:] + 8846328
17 MiSuburbano 0x101422b48 +[FIRVisionTextRecognizer textRecognizerForApp:logger:options:type:] + 8845976
18 MiSuburbano 0x1014228b0 +[FIRVisionTextRecognizer onDeviceTextRecognizerForApp:logger:] + 8845312
19 MiSuburbano 0x1014128a8 -[FIRVision onDeviceTextRecognizer] + 8779768
20 MiSuburbano 0x100f6c260 -[VisionImagePresenter firebaseImage:response:] + 34 (VisionImagePresenter.m:34)
21 MiSuburbano 0x100f4184c -[YourBalanceViewController firebaseImage] + 184 (YourBalanceViewController.m:184)
22 UIKitCore 0x240f4d990 -[UIImagePickerController _imagePickerDidCompleteWithInfo:] + 128
23 PhotoLibrary 0x227766148 PLNotifyImagePickerOfImageAvailability
24 CameraUI 0x23202ad84 initPLNotifyImagePickerOfImageAvailability
25 CameraUI 0x23202a728 -[CAMImagePickerCameraViewController _handleCapturedImagePickerPhotoWithCropOverlayOutput:]
26 AssetsLibraryServices 0x221dcb78c __pl_dispatch_sync_block_invoke + 36
27 libdispatch.dylib 0x213e22484 _dispatch_client_callout + 16
28 libdispatch.dylib 0x213e0281c _dispatch_async_and_wait_invoke + 92
29 libdispatch.dylib 0x213e22484 _dispatch_client_callout + 16
30 libdispatch.dylib 0x213e01b34 _dispatch_main_queue_callback_4CF$VARIANT$armv81 + 1012
31 CoreFoundation 0x214379ce4 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 12
32 CoreFoundation 0x214374bac __CFRunLoopRun + 1964
33 CoreFoundation 0x2143740e0 CFRunLoopRunSpecific + 436
34 GraphicsServices 0x2165ed584 GSEventRunModal + 100
35 UIKitCore 0x2415c4c00 UIApplicationMain + 212
36 MiSuburbano 0x101135190 main + 33 (main.mm:33)
37 libdyld.dylib 0x213e32bb4 start + 4
--
#0. Crashed: com.apple.main-thread
0 libsystem_kernel.dylib 0x213f7f104 __pthread_kill + 8
1 libsystem_pthread.dylib 0x213fff948 pthread_kill$VARIANT$armv81 + 296
2 libsystem_c.dylib 0x213ed6d78 abort + 140
3 libsystem_malloc.dylib 0x213fd3768 _malloc_put + 570
4 libsystem_malloc.dylib 0x213fd3924 malloc_report + 64
5 libsystem_malloc.dylib 0x213fc62d4 free + 376
6 libc++.1.dylib 0x213583d28 std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >::__grow_by_and_replace(unsigned long, unsigned long, unsigned long, unsigned long, unsigned long, unsigned long, char const*) + 248
7 libc++.1.dylib 0x213583208 std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >::assign(char const*, unsigned long) + 108
8 libc++.1.dylib 0x213583188 std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >::operator=(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&) + 56
9 MiSuburbano 0x101d506e4 GMVx_farmhash::Fingerprint128(char const*, unsigned long) + 4760932
10 MiSuburbano 0x102031698 GMVx_farmhash::Fingerprint128(char const*, unsigned long) + 7779608
11 MiSuburbano 0x101940d8c GMVx_farmhash::Fingerprint128(char const*, unsigned long) + 502796
12 MiSuburbano 0x101940b00 GMVx_farmhash::Fingerprint128(char const*, unsigned long) + 502144
13 MiSuburbano 0x1018c8180 +[GMVDetector detectorForClassNamed:options:] + 8192
14 MiSuburbano 0x1018c8044 +[GMVDetector detectorOfType:options:] + 7876
15 MiSuburbano 0x101423298 -[FIRVisionTextRecognizer initWithApp:logger:options:type:detector:cloudVisionService:] + 8847848
16 MiSuburbano 0x101422ca8 +[FIRVisionTextRecognizer textRecognizerForApp:logger:options:type:detector:cloudVisionService:] + 8846328
17 MiSuburbano 0x101422b48 +[FIRVisionTextRecognizer textRecognizerForApp:logger:options:type:] + 8845976
18 MiSuburbano 0x1014228b0 +[FIRVisionTextRecognizer onDeviceTextRecognizerForApp:logger:] + 8845312
19 MiSuburbano 0x1014128a8 -[FIRVision onDeviceTextRecognizer] + 8779768
20 MiSuburbano 0x100f6c260 -[VisionImagePresenter firebaseImage:response:] + 34 (VisionImagePresenter.m:34)
21 MiSuburbano 0x100f4184c -[YourBalanceViewController firebaseImage] + 184 (YourBalanceViewController.m:184)
22 UIKitCore 0x240f4d990 -[UIImagePickerController _imagePickerDidCompleteWithInfo:] + 128
23 PhotoLibrary 0x227766148 PLNotifyImagePickerOfImageAvailability
24 CameraUI 0x23202ad84 initPLNotifyImagePickerOfImageAvailability
25 CameraUI 0x23202a728 -[CAMImagePickerCameraViewController _handleCapturedImagePickerPhotoWithCropOverlayOutput:]
26 AssetsLibraryServices 0x221dcb78c __pl_dispatch_sync_block_invoke + 36
27 libdispatch.dylib 0x213e22484 _dispatch_client_callout + 16
28 libdispatch.dylib 0x213e0281c _dispatch_async_and_wait_invoke + 92
29 libdispatch.dylib 0x213e22484 _dispatch_client_callout + 16
30 libdispatch.dylib 0x213e01b34 _dispatch_main_queue_callback_4CF$VARIANT$armv81 + 1012
31 CoreFoundation 0x214379ce4 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 12
32 CoreFoundation 0x214374bac __CFRunLoopRun + 1964
33 CoreFoundation 0x2143740e0 CFRunLoopRunSpecific + 436
34 GraphicsServices 0x2165ed584 GSEventRunModal + 100
35 UIKitCore 0x2415c4c00 UIApplicationMain + 212
36 MiSuburbano 0x101135190 main + 33 (main.mm:33)
37 libdyld.dylib 0x213e32bb4 start + 4
#1. Thread
0 libsystem_kernel.dylib 0x213f7fb9c __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x2140051c0 _pthread_wqthread + 540
2 libsystem_pthread.dylib 0x214007cec start_wqthread + 4
#2. Thread
0 libsystem_kernel.dylib 0x213f7fb9c __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x214005100 _pthread_wqthread + 348
2 libsystem_pthread.dylib 0x214007cec start_wqthread + 4
#3. com.apple.uikit.eventfetch-thread
0 libsystem_kernel.dylib 0x213f73ea4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x213f7337c mach_msg + 72
2 CoreFoundation 0x214379ad8 __CFRunLoopServiceMachPort + 236
3 CoreFoundation 0x214374974 __CFRunLoopRun + 1396
4 CoreFoundation 0x2143740e0 CFRunLoopRunSpecific + 436
5 Foundation 0x214d6a494 -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 300
6 Foundation 0x214d6a340 -[NSRunLoop(NSRunLoop) runUntilDate:] + 148
7 UIKitCore 0x2416b50c4 -[UIEventFetcher threadMain] + 136
8 Foundation 0x214e9d23c __NSThread__start__ + 1040
9 libsystem_pthread.dylib 0x21400425c _pthread_body + 128
10 libsystem_pthread.dylib 0x2140041bc _pthread_start + 48
11 libsystem_pthread.dylib 0x214007cf4 thread_start + 4
#4. CCTLogWriter
0 libsystem_kernel.dylib 0x213f81948 write + 8
1 CoreFoundation 0x21433b520 fdWrite + 20
2 CoreFoundation 0x21433bbd4 fileWrite + 24
3 CoreFoundation 0x21438b42c CFWriteStreamWrite + 300
4 MiSuburbano 0x10127a150 GPBRefreshBuffer + 72 (GPBCodedOutputStream.m:72)
5 MiSuburbano 0x101906390 -[GMVx_CCTLogOutputStream writeLogEvent:] + 262672
6 MiSuburbano 0x101904e68 -[GMVx_CCTLogWriter writeLogInternal:pseudonymousID:logDirectory:clock:logTransformers:] + 257256
7 MiSuburbano 0x101904590 __80-[GMVx_CCTLogWriter writeLog:pseudonymousID:logDirectory:clock:logTransformers:]_block_invoke_2 + 254992
8 libdispatch.dylib 0x213e216c8 _dispatch_call_block_and_release + 24
9 libdispatch.dylib 0x213e22484 _dispatch_client_callout + 16
10 libdispatch.dylib 0x213dfcfa0 _dispatch_lane_serial_drain$VARIANT$armv81 + 548
11 libdispatch.dylib 0x213dfdae4 _dispatch_lane_invoke$VARIANT$armv81 + 412
12 libdispatch.dylib 0x213e05f04 _dispatch_workloop_worker_thread + 584
13 libsystem_pthread.dylib 0x2140050dc _pthread_wqthread + 312
14 libsystem_pthread.dylib 0x214007cec start_wqthread + 4
#5. com.apple.root.default-qos
0 libsystem_kernel.dylib 0x213f7f9fc __ulock_wait + 8
1 libdispatch.dylib 0x213df0e20 _dispatch_ulock_wait + 56
2 libdispatch.dylib 0x213df0f58 _dispatch_thread_event_wait_slow$VARIANT$armv81 + 48
3 libdispatch.dylib 0x213e029e8 __DISPATCH_WAIT_FOR_QUEUE__ + 328
4 libdispatch.dylib 0x213e025f0 _dispatch_sync_f_slow + 144
5 AssetsLibraryServices 0x221dcb75c pl_dispatch_sync + 68
6 PhotoLibrary 0x2277993a8 -[PLCropOverlay(PhotoSaving) _backgroundSavePhoto:]
7 AssetsLibraryServices 0x221dcb82c __pl_dispatch_async_block_invoke + 36
8 libdispatch.dylib 0x213e216c8 _dispatch_call_block_and_release + 24
9 libdispatch.dylib 0x213e22484 _dispatch_client_callout + 16
10 libdispatch.dylib 0x213df8a6c _dispatch_queue_override_invoke + 664
11 libdispatch.dylib 0x213e04aec _dispatch_root_queue_drain + 344
12 libdispatch.dylib 0x213e0534c _dispatch_worker_thread2 + 116
13 libsystem_pthread.dylib 0x21400517c _pthread_wqthread + 472
14 libsystem_pthread.dylib 0x214007cec start_wqthread + 4
#6. com.twitter.crashlytics.ios.MachExceptionServer
0 libsystem_kernel.dylib 0x213f73ea4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x213f7337c mach_msg + 72
2 MiSuburbano 0x1012ebd14 CLSMachExceptionServer + 7572580
3 libsystem_pthread.dylib 0x21400425c _pthread_body + 128
4 libsystem_pthread.dylib 0x2140041bc _pthread_start + 48
5 libsystem_pthread.dylib 0x214007cf4 thread_start + 4
#7. Thread
0 libsystem_kernel.dylib 0x213f7fb9c __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x214005100 _pthread_wqthread + 348
2 libsystem_pthread.dylib 0x214007cec start_wqthread + 4
#8. Thread
0 libsystem_kernel.dylib 0x213f7fb9c __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x214005100 _pthread_wqthread + 348
2 libsystem_pthread.dylib 0x214007cec start_wqthread + 4
#9. GC Finalizer
0 libsystem_kernel.dylib 0x213f7ef0c __psynch_cvwait + 8
1 libsystem_pthread.dylib 0x214001410 _pthread_cond_wait$VARIANT$armv81 + 620
2 MiSuburbano 0x102b7ef7c il2cpp::os::posix::PosixWaitObject::Wait(unsigned int, bool) + 129 (PosixWaitObject.cpp:129)
3 MiSuburbano 0x102b64f34 FinalizerThread(void*) + 37 (GarbageCollector.cpp:37)
4 MiSuburbano 0x102b7b31c il2cpp::os::Thread::RunWrapper(void*) + 106 (Thread.cpp:106)
5 MiSuburbano 0x102b816f8 il2cpp::os::ThreadImpl::ThreadStartWrapper(void*) + 108 (ThreadImpl.cpp:108)
6 libsystem_pthread.dylib 0x21400425c _pthread_body + 128
7 libsystem_pthread.dylib 0x2140041bc _pthread_start + 48
8 libsystem_pthread.dylib 0x214007cf4 thread_start + 4
#10. BatchDeleteObjects
0 libsystem_kernel.dylib 0x213f73ee0 semaphore_wait_trap + 8
1 libdispatch.dylib 0x213df0b64 _dispatch_sema4_wait$VARIANT$armv81 + 24
2 libdispatch.dylib 0x213df15d4 _dispatch_semaphore_wait_slow + 128
3 MiSuburbano 0x10242a114 ThreadedStreamBuffer::HandleReadOverflow(unsigned int&, unsigned int&) + 26 (TimeHelper.h:26)
4 MiSuburbano 0x102392078 BatchDeleteStep2Threaded(void*) + 243 (ThreadedStreamBuffer.h:243)
5 MiSuburbano 0x102429784 Thread::RunThreadWrapper(void*) + 44 (Thread.cpp:44)
6 libsystem_pthread.dylib 0x21400425c _pthread_body + 128
7 libsystem_pthread.dylib 0x2140041bc _pthread_start + 48
8 libsystem_pthread.dylib 0x214007cf4 thread_start + 4
#11. AsyncReadManager
0 libsystem_kernel.dylib 0x213f73ee0 semaphore_wait_trap + 8
1 libdispatch.dylib 0x213df0b64 _dispatch_sema4_wait$VARIANT$armv81 + 24
2 libdispatch.dylib 0x213df15d4 _dispatch_semaphore_wait_slow + 128
3 MiSuburbano 0x10244c4c4 AsyncReadManagerThreaded::ThreadEntry() + 29 (Mutex.h:29)
4 MiSuburbano 0x10244c2e0 AsyncReadManagerThreaded::StaticThreadEntry(void*) + 57 (AsyncReadManagerThreaded.cpp:57)
5 MiSuburbano 0x102429784 Thread::RunThreadWrapper(void*) + 44 (Thread.cpp:44)
6 libsystem_pthread.dylib 0x21400425c _pthread_body + 128
7 libsystem_pthread.dylib 0x2140041bc _pthread_start + 48
8 libsystem_pthread.dylib 0x214007cf4 thread_start + 4
#12. JavaScriptCore bmalloc scavenger
0 libsystem_kernel.dylib 0x213f7ef0c __psynch_cvwait + 8
1 libsystem_pthread.dylib 0x214001410 _pthread_cond_wait$VARIANT$armv81 + 620
2 libc++.1.dylib 0x21354c4d0 std::__1::condition_variable::wait(std::__1::unique_lock<std::__1::mutex>&) + 24
3 JavaScriptCore 0x21b72d648 void std::__1::condition_variable_any::wait<std::__1::unique_lock<bmalloc::Mutex> >(std::__1::unique_lock<bmalloc::Mutex>&) + 104
4 JavaScriptCore 0x21b73173c bmalloc::Scavenger::threadRunLoop() + 176
5 JavaScriptCore 0x21b730e70 bmalloc::Scavenger::Scavenger(std::__1::lock_guard<bmalloc::Mutex>&) + 10
6 JavaScriptCore 0x21b73291c std::__1::__thread_specific_ptr<std::__1::__thread_struct>::set_pointer(std::__1::__thread_struct*) + 38
7 libsystem_pthread.dylib 0x21400425c _pthread_body + 128
8 libsystem_pthread.dylib 0x2140041bc _pthread_start + 48
9 libsystem_pthread.dylib 0x214007cf4 thread_start + 4
#13. WebThread
0 libsystem_kernel.dylib 0x213f73ea4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x213f7337c mach_msg + 72
2 CoreFoundation 0x214379ad8 __CFRunLoopServiceMachPort + 236
3 CoreFoundation 0x214374974 __CFRunLoopRun + 1396
4 CoreFoundation 0x2143740e0 CFRunLoopRunSpecific + 436
5 WebCore 0x21d1e53e8 RunWebThread(void*) + 592
6 libsystem_pthread.dylib 0x21400425c _pthread_body + 128
7 libsystem_pthread.dylib 0x2140041bc _pthread_start + 48
8 libsystem_pthread.dylib 0x214007cf4 thread_start + 4
#14. com.apple.NSURLConnectionLoader
0 libsystem_kernel.dylib 0x213f73ea4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x213f7337c mach_msg + 72
2 CoreFoundation 0x214379ad8 __CFRunLoopServiceMachPort + 236
3 CoreFoundation 0x214374974 __CFRunLoopRun + 1396
4 CoreFoundation 0x2143740e0 CFRunLoopRunSpecific + 436
5 CFNetwork 0x21499700c -[__CoreSchedulingSetRunnable runForever] + 212
6 Foundation 0x214e9d23c __NSThread__start__ + 1040
7 libsystem_pthread.dylib 0x21400425c _pthread_body + 128
8 libsystem_pthread.dylib 0x2140041bc _pthread_start + 48
9 libsystem_pthread.dylib 0x214007cf4 thread_start + 4
#15. Thread
0 libsystem_kernel.dylib 0x213f73ef8 semaphore_timedwait_trap + 8
1 libdispatch.dylib 0x213df0c4c _dispatch_sema4_timedwait$VARIANT$armv81 + 64
2 libdispatch.dylib 0x213df159c _dispatch_semaphore_wait_slow + 72
3 libdispatch.dylib 0x213e048dc _dispatch_worker_thread + 344
4 libsystem_pthread.dylib 0x21400425c _pthread_body + 128
5 libsystem_pthread.dylib 0x2140041bc _pthread_start + 48
6 libsystem_pthread.dylib 0x214007cf4 thread_start + 4
#16. Thread
0 libsystem_kernel.dylib 0x213f73ef8 semaphore_timedwait_trap + 8
1 libdispatch.dylib 0x213df0c4c _dispatch_sema4_timedwait$VARIANT$armv81 + 64
2 libdispatch.dylib 0x213df159c _dispatch_semaphore_wait_slow + 72
3 libdispatch.dylib 0x213e048dc _dispatch_worker_thread + 344
4 libsystem_pthread.dylib 0x21400425c _pthread_body + 128
5 libsystem_pthread.dylib 0x2140041bc _pthread_start + 48
6 libsystem_pthread.dylib 0x214007cf4 thread_start + 4
#17. AVAudioSession Notify Thread
0 libsystem_kernel.dylib 0x213f73ea4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x213f7337c mach_msg + 72
2 CoreFoundation 0x214379ad8 __CFRunLoopServiceMachPort + 236
3 CoreFoundation 0x214374974 __CFRunLoopRun + 1396
4 CoreFoundation 0x2143740e0 CFRunLoopRunSpecific + 436
5 AVFAudio 0x21a38860c GenericRunLoopThread::Entry(void*) + 164
6 AVFAudio 0x21a3b4768 CAPThread::Entry(CAPThread*) + 88
7 libsystem_pthread.dylib 0x21400425c _pthread_body + 128
8 libsystem_pthread.dylib 0x2140041bc _pthread_start + 48
9 libsystem_pthread.dylib 0x214007cf4 thread_start + 4
ARCore facial recognition tracking: app crashes
Process: com.google.ar.sceneform.samples.augmentedfaces, PID: 11199
java.lang.IllegalArgumentException: Invalid image data size.
at com.google.android.gms.vision.Frame$Builder.setImageData(Unknown Source:13)
at com.google.firebase.ml.vision.common.FirebaseVisionImage.zza(Unknown Source:137)
at com.google.firebase.ml.vision.face.FirebaseVisionFaceDetector.detectInImage(Unknown Source:26)
at com.google.ar.sceneform.samples.augmentedfaces.AugmentedFacesActivity.lambda$onCreate$1(AugmentedFacesActivity.java:180)
Expected: the face is identified and the app does not crash.
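The `java.lang.IllegalArgumentException: Invalid image data size` above is the check in `Frame.Builder.setImageData` failing because the byte buffer does not match the size implied by the declared width, height, and format. For the common NV21 camera format, that size is a full-resolution Y plane plus two half-resolution chroma planes. A small sanity-check sketch (pure Java, no Android dependencies; the helper name is my own):

```java
public class Nv21Size {
    // Expected byte length of an NV21 frame: width*height luma bytes plus
    // interleaved V/U bytes at half resolution in each dimension.
    public static int nv21ByteLength(int width, int height) {
        return width * height + 2 * ((width + 1) / 2) * ((height + 1) / 2);
    }

    public static void main(String[] args) {
        // For a 640x480 camera frame the buffer must be exactly this long.
        System.out.println(nv21ByteLength(640, 480)); // 460800
    }
}
```

Comparing the length of the buffer you pass to the detector against this value, before building the `FirebaseVisionImage`, is a quick way to confirm whether a stride/cropping mismatch is the cause.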
Clone googlesamples/mlkit/ios and install pods.
At the time of these tests, the project is using GoogleMLKit/BarcodeScanning (0.60.0). It is being run using Xcode 11.5 on macOS Catalina 10.15.4.
Actual: It does not recognize this DataMatrix code.
Expected: It recognizes this DataMatrix code. The code has valid data - HP ProBook 440 G4,Z1Z83UT#ABA,5CD712BZVJ,1y1y0y - which can be verified using an online decoder service: https://online-barcode-reader.inliteresearch.com/
I have tried the ML Kit for Firebase sample as well, and Preview.app on macOS, to no avail.
I am using NextLevel for session capture - https://github.com/NextLevel/NextLevel
I am using Firebase Vision Face Detection to find the Head Z angle (headEulerAngleZ).
In the callback
func nextLevel(_ nextLevel: NextLevel, didAppendVideoSampleBuffer sampleBuffer: CMSampleBuffer, inSession session: NextLevelSession) {
I am using
let visionImage = VisionImage(buffer: sampleBuffer)
From the code of NextLevel it is clear that the callback is made on the main thread -
DispatchQueue.main.async {
self.videoDelegate?.nextLevel(self, didAppendVideoSampleBuffer: sampleBuffer, inSession: session)
}
When I process the image in this same main-queue callback, the app crashes with an error message -
Thread 1: Exception: "This method must be called on a background thread."
But if I do the processing on a background queue,
DispatchQueue.global(qos: .background).sync { }
the code again crashes at this line -
detectedFaces = try faceDetector.results(in: image)
var detectedFaces: [VisionFace]? = nil
do {
detectedFaces = try faceDetector.results(in: image)
} catch let error {
print("Failed to detect faces with error: \(error.localizedDescription).")
}
Question - What is the correct way of calling the Vision face detection method on a background thread?
MLVisionExample is not detecting Delaware and North Carolina state driver's license barcodes. I have not checked other states' driver's license barcodes.
I tried providing the UIImagePickerControllerOriginalImage image, but the success ratio is less than 10% on an iPhone 7 Plus device. It works fine on an iPhone X. Not tested on other devices.
Hi, I have a problem with the InceptionV3 tflite model executed on the ML Kit SDK. I'm not using the ILSVRC dataset, for reasons of storage space and execution time; I'm using the Caltech dataset (find it here), and I have also removed categories that Inception V3 is not able to recognize. I'm doing a comparison of the new mobile machine learning SDKs, trying both the new Snapdragon Neural Processing Engine (SNPE) and ML Kit. I ran InceptionV3 on both SDKs: with SNPE I reached 82% correct classifications, and with ML Kit only 47%. For ML Kit, I downloaded the tflite float model from here.
I build the network with the following code:
long startBuild = SystemClock.elapsedRealtime();
FirebaseLocalModelSource localSource =
new FirebaseLocalModelSource.Builder("inception_v3") // Assign a name for this model
.setAssetFilePath("inception_v3.tflite")
.build();
FirebaseModelManager.getInstance().registerLocalModelSource(localSource);
FirebaseModelOptions options = new FirebaseModelOptions.Builder()
.setLocalModelName("inception_v3")
.build();
FirebaseModelInterpreter firebaseInterpreter =
FirebaseModelInterpreter.getInstance(options);
FirebaseModelInputOutputOptions inputOutputOptions =
new FirebaseModelInputOutputOptions.Builder()
.setInputFormat(0, FirebaseModelDataType.FLOAT32, new int[]{1, 299, 299, 3})
.setOutputFormat(0, FirebaseModelDataType.FLOAT32, new int[]{1, 1001})
.build();
long endBuild = SystemClock.elapsedRealtime();
And then I preprocess the image:
resized_image = Bitmap.createScaledBitmap(image, 299,299, false);
input = new float[1][299][299][3];
for (int x = 0; x < 299; x++) {
for (int y = 0; y < 299; y++) {
int pixel = resized_image.getPixel(x, y);
// Normalize channel values to [0.0, 1.0]. This requirement varies by
// model. For example, some models might require values to be normalized
// to the range [-1.0, 1.0] instead.
float b = ((pixel) & 0xFF);
float g = ((pixel >> 8) & 0xFF);
float r = ((pixel >> 16) & 0xFF);
input[batchNum][x][y][0] = (r - 127) / 128.0f;
input[batchNum][x][y][1] = (g - 127) / 128.0f;
input[batchNum][x][y][2] = (b - 127) / 128.0f;
//input[batchNum][x][y][0] = (Color.red(pixel) - 128) / 128.0f;
//input[batchNum][x][y][1] = (Color.green(pixel) - 128) / 128.0f;
//input[batchNum][x][y][2] = (Color.blue(pixel) - 128) / 128.0f;
}
}
inputs = new FirebaseModelInputs.Builder()
.add(input) // add() as many input arrays as your model requires
.build();
Task<FirebaseModelOutputs> task = firebaseInterpreter.run(inputs, inputOutputOptions);
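One thing worth double-checking in the preprocessing above: the input tensor declared as {1, 299, 299, 3} is laid out [batch][height][width][channels], while Bitmap.getPixel(x, y) takes the column first, so writing input[batchNum][x][y][c] inside an x-outer loop transposes the image, which alone can account for a large accuracy drop. A minimal, Android-free sketch of the fill order, using the [-1, 1] scaling that TF-Slim float models are commonly exported with ((v - 127.5) / 127.5 is an assumption here; check the preprocessing your model's export expects):

```java
public class InceptionPreprocess {
    // Convert ARGB pixels (row-major, length width*height) into a
    // [1][height][width][3] float tensor scaled to [-1, 1].
    // Note the index order: row (y) before column (x), matching the
    // [batch][height][width][channels] tensor layout.
    public static float[][][][] toTensor(int[] pixels, int width, int height) {
        float[][][][] input = new float[1][height][width][3];
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int pixel = pixels[y * width + x];
                float r = (pixel >> 16) & 0xFF;
                float g = (pixel >> 8) & 0xFF;
                float b = pixel & 0xFF;
                input[0][y][x][0] = (r - 127.5f) / 127.5f;
                input[0][y][x][1] = (g - 127.5f) / 127.5f;
                input[0][y][x][2] = (b - 127.5f) / 127.5f;
            }
        }
        return input;
    }

    public static void main(String[] args) {
        // A 2x1 image: one white pixel, one black pixel.
        int[] pixels = {0xFFFFFFFF, 0xFF000000};
        float[][][][] t = toTensor(pixels, 2, 1);
        System.out.println(t[0][0][0][0]); // white red channel -> 1.0
        System.out.println(t[0][0][1][0]); // black red channel -> -1.0
    }
}
```

On an Android Bitmap you would fill the `pixels` array with `Bitmap.getPixels(...)` and pass the resulting tensor to the interpreter as in the snippet above.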
Hi,
I really like these samples and was wondering if there are plans to update android/material-showcase to use CameraX + PreviewView.
If not, can you explain the technical reason(s) why this cannot be done?
I noticed there is this PR - #14 to update the translate-showcase sample to use CameraX + Preview.
Thanks!
I'm reading a buffer like so:
let visionImage = VisionImage(buffer: sampleBuffer)
barcodeDetector.detect(in: visionImage) { (barcodes, error) in
...
}
I'm seeing random characters being swapped in Code 128/39 barcodes, and no errors are raised. As you can see, I'm scanning enterprise labels. The 2D DataMatrix codes scan fine; however, their 1D counterparts randomly fail.
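For what it's worth, Code 39 supports an optional mod-43 check character that can be verified on the client as one way to catch this kind of substitution (useful only if your labels are printed with it; Code 128's mandatory mod-103 checksum, by contrast, is consumed by the scanner and not exposed in the decoded data). A sketch of the standard mod-43 computation:

```java
public class Code39Check {
    // The 43 Code 39 data characters, in check-value order 0..42.
    private static final String CHARSET =
            "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ-. $/+%";

    // Mod-43 check character for a Code 39 payload (without start/stop '*').
    public static char checkChar(String data) {
        int sum = 0;
        for (char c : data.toCharArray()) {
            int v = CHARSET.indexOf(c);
            if (v < 0) {
                throw new IllegalArgumentException("not a Code 39 character: " + c);
            }
            sum += v;
        }
        return CHARSET.charAt(sum % 43);
    }

    public static void main(String[] args) {
        // Values 1+2+...+9 = 45; 45 mod 43 = 2, so the check character is '2'.
        System.out.println(checkChar("123456789")); // 2
    }
}
```

If the last character of each decoded value fails this check, you can discard that frame and wait for the next detection instead of accepting a corrupted read.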
I haven't experienced this bug personally, but I keep seeing this crash (about 10 crashes for 2 users in the last month) appearing in Crashlytics.
Firebase version: 5.20.2
Podfile.lock:
- Firebase/MLVision (5.20.2):
- Firebase/CoreOnly
- FirebaseMLVision (= 0.15.0)
- Firebase/MLVisionTextModel (5.20.2):
- Firebase/CoreOnly
- FirebaseMLVisionTextModel (= 0.15.0)
Device:
Model: iPhone XS Max
Orientation: Portrait
RAM free: 477.13 MB
Disk free: 65.87 GB
Version: 12.3.1 (16F203)
Orientation: Portrait
Jailbroken: No
Stack Trace:
EXC_BREAKPOINT 0x00000001ba81f748
Crashed: com.google.firebaseml.textrecognition
0 libsystem_malloc.dylib 0x1ba81f748 nanov2_allocate_from_block + 580
1 ??? 0x443d01ba81e86c (Missing)
2 ??? 0x180f01ba81e7a0 (Missing)
3 ??? 0x664e81ba824ea4 (Missing)
4 ??? 0x4a8b01ba822b68 (Missing)
5 ??? 0x718001ba8235ec (Missing)
6 ??? 0x14e081b9e37570 (Missing)
7 ??? 0x38808103498540 (Missing)
8 MyApp 0x10346eb08 ocr::photo::jni_helper::RecognizeRawDataWithBoxAndAssistAndDetections(unsigned char const*, int, int, int, int, int, int, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, int, int, int, int, std::__1::vector<ocr::photo::TextBox, std::__1::allocator<ocr::photo::TextBox> > const&, std::__1::vector<ocr::photo::DetectionBox, std::__1::allocator<ocr::photo::DetectionBox> > const&, ocr::photo::QueryOptions const&, ocr::photo::QueryMetaResults*, std::__1::vector<ocr::photo::LineBox, std::__1::allocator<ocr::photo::LineBox> >*) + 4375816968
9 MyApp 0x103063dd8 -[GMVTextDetector textsInImageBufferData:colorModel:startDate:width:height:options:orientation:format:] + 4371578328
10 MyApp 0x102ef93a4 __51-[FIRVisionTextRecognizer processImage:completion:]_block_invoke + 4370092964
11 libdispatch.dylib 0x1ba658304 _dispatch_call_block_and_release + 32
12 ??? 0x5c1301ba659884 (Missing)
13 ??? 0x17d01ba660dd4 (Missing)
14 ??? 0x6cf781ba661918 (Missing)
15 ??? 0xb5101ba669cc0 (Missing)
16 ??? 0x8a701ba854a98 (Missing)
17 ??? 0x249981ba85adc4 (Missing)
com.apple.main-thread
0 libsystem_kernel.dylib 0x1ba7d1c60 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x1ba7d10e8 mach_msg + 76
2 ??? 0xc0c81babd9e10 (Missing)
3 ??? 0x279c81babd4ab4 (Missing)
4 ??? 0x5dbf81babd4254 (Missing)
5 ??? 0x47f401bce13d8c (Missing)
6 ??? 0x6f8101e7f1c4c0 (Missing)
7 ??? 0x7dcf8102c452dc (Missing)
8 libdyld.dylib 0x1ba690fd8 start + 4
Thread #1
0 libsystem_kernel.dylib 0x1ba7ddb64 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x1ba854abc _pthread_wqthread + 344
2 ??? 0x1ac681ba85adc4 (Missing)
Thread #2
0 libsystem_kernel.dylib 0x1ba7ddb64 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x1ba854abc _pthread_wqthread + 344
2 ??? 0x113c81ba85adc4 (Missing)
Thread #3
0 libsystem_kernel.dylib 0x1ba7ddb64 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x1ba854b7c _pthread_wqthread + 536
2 ??? 0x58e081ba85adc4 (Missing)
com.apple.uikit.eventfetch-thread
0 libsystem_kernel.dylib 0x1ba7d1c60 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x1ba7d10e8 mach_msg + 76
2 ??? 0x4a2d81babd9e10 (Missing)
3 ??? 0x1a8c01babd4ab4 (Missing)
4 ??? 0x619081babd4254 (Missing)
5 ??? 0x1d4c81bb5b404c (Missing)
6 ??? 0x40e581bb5b3ed4 (Missing)
7 ??? 0x32b01e80080d4 (Missing)
8 ??? 0x198301bb5b2c4c (Missing)
9 ??? 0x646901bb6e8e54 (Missing)
10 ??? 0x1da481ba852908 (Missing)
11 ??? 0x47b781ba852864 (Missing)
12 ??? 0x1b1301ba85adcc (Missing)
Thread #4
0 libsystem_kernel.dylib 0x1ba7ddb64 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x1ba854b7c _pthread_wqthread + 536
2 ??? 0x7ba181ba85adc4 (Missing)
Thread #5
0 libsystem_kernel.dylib 0x1ba7ddb64 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x1ba854abc _pthread_wqthread + 344
2 ??? 0x666001ba85adc4 (Missing)
Thread #6
0 libsystem_kernel.dylib 0x1ba7ddb64 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x1ba854b7c _pthread_wqthread + 536
2 ??? 0x5e5b01ba85adc4 (Missing)
com.apple.mobileslideshow.accessCallbacks
0 libsystem_kernel.dylib 0x1ba7d1c9c semaphore_wait_trap + 8
1 libdispatch.dylib 0x1ba659f6c _dispatch_sema4_wait + 28
2 ??? 0x192e81ba65aa24 (Missing)
3 ??? 0x56c30102cbd3f8 (Missing)
4 RealmSwift 0x1049f3f5c Realm.write(_:) (Realm.swift:155)
5 MyApp 0x102cbb048 specialized closure #4 in closure #1 in ScanControllerObject.scanPhotos(assetArray:) (ScanControllerObject.swift:395)
6 MyApp 0x102cc1300 partial apply for specialized (<compiler-generated>)
7 MyApp 0x102cca8e8 thunk for @escaping @callee_guaranteed (@guaranteed UIImage?, @guaranteed [AnyHashable : Any]?) -> () (<compiler-generated>)
8 Photos 0x1c99367ec __84-[PHImageManager requestImageForAsset:targetSize:contentMode:options:resultHandler:]_block_invoke_3 + 328
9 ??? 0x4c9701c992b10c (Missing)
10 ??? 0x73c201c992fab4 (Missing)
11 ??? 0x7d8e01c992e0b8 (Missing)
12 ??? 0x2c3b81c9931324 (Missing)
13 ??? 0x5d6701c9936308 (Missing)
14 ??? 0x4c1501c9939a00 (Missing)
15 ??? 0x44758102cb9d88 (Missing)
16 MyApp 0x102ccadac thunk for @escaping @callee_guaranteed (@guaranteed PHAsset, @unowned Int, @unowned UnsafeMutablePointer<ObjCBool>) -> () (<compiler-generated>)
17 CoreFoundation 0x1bac28d24 __NSArrayEnumerate + 420
18 ??? 0x6dc801c9a22bd0 (Missing)
19 ??? 0x21d48102cb8d30 (Missing)
20 MyApp 0x102cb84d0 closure #1 in ScanControllerObject.setupObject() (PhotoLibraryManager.swift:17)
21 MyApp 0x102cf28bc thunk for @escaping @callee_guaranteed (@unowned PHAuthorizationStatus) -> () (<compiler-generated>)
22 Photos 0x1c995400c __39+[PHPhotoLibrary requestAuthorization:]_block_invoke + 88
23 ??? 0x766e81c86ccc84 (Missing)
24 ??? 0x7a981c86b3490 (Missing)
25 ??? 0x47c01ba658304 (Missing)
26 ??? 0x344a81ba659884 (Missing)
27 ??? 0x370781ba660dd4 (Missing)
28 ??? 0x1a4501ba661918 (Missing)
29 ??? 0x570781ba669cc0 (Missing)
30 ??? 0xb6d81ba854a98 (Missing)
31 ??? 0x2b4001ba85adc4 (Missing)
Thread #7
0 libsystem_kernel.dylib 0x1ba7ddb64 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x1ba854abc _pthread_wqthread + 344
2 ??? 0x7b1d01ba85adc4 (Missing)
com.twitter.crashlytics.ios.MachExceptionServer
0 MyApp 0x102d20bcc CLSProcessRecordAllThreads (CLSProcess.c:376)
1 MyApp 0x102d20fb4 CLSProcessRecordAllThreads (CLSProcess.c:407)
2 MyApp 0x102d10938 CLSHandler (CLSHandler.m:26)
3 MyApp 0x102d0bd28 CLSMachExceptionServer (CLSMachException.c:446)
4 libsystem_pthread.dylib 0x1ba852908 _pthread_body + 132
5 ??? 0x332d01ba852864 (Missing)
6 ??? 0x507101ba85adcc (Missing)
Thread #8
0 libsystem_kernel.dylib 0x1ba7ddb64 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x1ba854abc _pthread_wqthread + 344
2 ??? 0x11fd01ba85adc4 (Missing)
Thread #9
0 libsystem_pthread.dylib 0x1ba85adc0 start_wqthread + 254
com.apple.NSURLConnectionLoader
0 libsystem_kernel.dylib 0x1ba7d1c60 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x1ba7d10e8 mach_msg + 76
2 ??? 0x178a01babd9e10 (Missing)
3 ??? 0xac401babd4ab4 (Missing)
4 ??? 0x4ad01babd4254 (Missing)
5 ??? 0xf5f81bb1f5c88 (Missing)
6 ??? 0xeea01bb5b2c4c (Missing)
7 ??? 0x4a6f81bb6e8e54 (Missing)
8 ??? 0x6cc381ba852908 (Missing)
9 ??? 0x74781ba852864 (Missing)
10 ??? 0x3db781ba85adcc (Missing)
H11ANEServicesThread
0 libsystem_kernel.dylib 0x1ba7d1c60 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x1ba7d10e8 mach_msg + 76
2 ??? 0x649581babd9e10 (Missing)
3 ??? 0x796d01babd4ab4 (Missing)
4 ??? 0x16d81babd4254 (Missing)
5 ??? 0x551b81babd4f88 (Missing)
6 ??? 0x3d5101e4210a04 (Missing)
7 ??? 0x2c701ba852908 (Missing)
8 ??? 0x224f81ba852864 (Missing)
9 ??? 0x2f3c01ba85adcc (Missing)
H11ANEServicesThread
0 libsystem_kernel.dylib 0x1ba7d1c60 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x1ba7d10e8 mach_msg + 76
2 ??? 0x196f01babd9e10 (Missing)
3 ??? 0x4a6001babd4ab4 (Missing)
4 ??? 0x605a81babd4254 (Missing)
5 ??? 0x3e5981babd4f88 (Missing)
6 ??? 0x4c6881e4210a04 (Missing)
7 ??? 0x4c7501ba852908 (Missing)
8 ??? 0x613d01ba852864 (Missing)
9 ??? 0x33a581ba85adcc (Missing)
Thread #10
0 libsystem_kernel.dylib 0x1ba7defc8 poll + 8
1 MyApp 0x1037db04c (anonymous namespace)::ExitTimeoutWatcher(void*) + 4379406412
2 libsystem_pthread.dylib 0x1ba852908 _pthread_body + 132
3 ??? 0x62b801ba852864 (Missing)
4 ??? 0x528c01ba85adcc (Missing)
Thread #11
0 libsystem_kernel.dylib 0x1ba7defc8 poll + 8
1 MyApp 0x1037daf2c (anonymous namespace)::ThreadLivenessWatcher(void*) + 4379406124
2 libsystem_pthread.dylib 0x1ba852908 _pthread_body + 132
3 ??? 0x6fb601ba852864 (Missing)
4 ??? 0x2c6701ba85adcc (Missing)
RLMRealm notification listener
0 libsystem_kernel.dylib 0x1ba7de8e4 kevent + 8
1 Realm 0x10453f588 realm::_impl::ExternalCommitHelper::listen() (external_commit_helper.cpp:217)
2 Realm 0x1045400d4 void* std::__1::__thread_proxy<std::__1::tuple<std::__1::unique_ptr<std::__1::__thread_struct, std::__1::default_delete<std::__1::__thread_struct> >, realm::_impl::ExternalCommitHelper::ExternalCommitHelper(realm::_impl::RealmCoordinator&)::$_0> >(void*) (tuple:170)
3 libsystem_pthread.dylib 0x1ba852908 _pthread_body + 132
4 ??? 0x63ee01ba852864 (Missing)
5 ??? 0x376901ba85adcc (Missing)
Any idea how to fix this? Help is greatly appreciated, as I've been getting complaints from users.
I am not able to scan the PDF417 barcode on a South African driving license (attached below). It doesn't scan anything. I am using the default ML Kit project from GitHub. I have added Firebase to the project.
The following dependencies are used.
// ML Kit dependencies
implementation 'com.google.firebase:firebase-ml-vision:19.0.2'
implementation 'com.google.firebase:firebase-ml-vision-image-label-model:17.0.2'
implementation 'com.google.firebase:firebase-ml-vision-face-model:17.0.2'
implementation 'com.google.firebase:firebase-ml-model-interpreter:17.0.3'
There are constantly recurring logs:
2019-02-28 21:10:34.172 3508-3746/com.google.firebase.samples.apps.mlkit D/MIDemoApp:CameraSource: Process an image
2019-02-28 21:10:34.172 3508-3746/com.google.firebase.samples.apps.mlkit D/skia: onFlyCompress
Expected: scan the PDF417 barcode and show the appropriate result for it. I scanned the same driving license via the Microblink app, and it scans perfectly.
I used the ML Kit code from GitHub.
Hello!
Background
I am using Firebase text recognition to recognise text in images. However, I have the following problem:
As you can see in the image, if the target image is slightly rotated, the frames overlap. What I would like to achieve is for the blue rectangles to follow the text rather than keeping a horizontal shape. Currently I do the drawing in the following manner, inside my processText(...) method:
// Paragraphs
for block in text.blocks {
// Lines
for line in block.lines {
drawFrame(line.frame, transform: transform)
}
The drawFrame(...) method does the following:
private func drawFrame(_ frame: CGRect, transform: CGAffineTransform) {
let transformedRect = frame.applying(transform)
UIUtilities.addRectangle(transformedRect, to: self.annotationOverlayView)
}
So, once the line element is transformed, I do the drawing:
public static func addRectangle(_ rectangle: CGRect, to view: UIView){
let rectangleView = UIView(frame: rectangle)
rectangleView.layer.borderColor = UIColor(hex: Constants.colorDarkBlue).cgColor
rectangleView.layer.borderWidth = Constants.lineWidth
rectangleView.bounds = rectangleView.frame.insetBy(dx:-3, dy: -3)
view.addSubview(rectangleView)
}
What did I try?
I tried looking at the addShape(...) method in this repository: https://github.com/firebase/quickstart-ios/blob/5b752734233d4be625ece2747385d4974f2fc07f/mlvision/MLVisionExample/UIUtilities.swift#L52
I am 100% confident that `addShape(...)` is the method to use, because I can see in the live-run example that it is capable of adjusting the shape of the highlight regardless of how much I rotate the image.
So I tried to do the following:
This is inside my processResults(...)
method. The captureImageView
is the view in which I put the captured image on and then do processing on it.
// Paragraphs
for block in text.blocks {
    // Lines
    for line in block.lines {
        let points = self.convertedPoints(from: line.cornerPoints, width: capturedImageView.bounds.width, height: capturedImageView.bounds.height)
        print("Points: \(points)")
        UIUtilities.addShape(
            withPoints: points,
            to: self.annotationOverlayView,
            color: UIColor.orange)
    }
}
Similar to the repository I link above, I copied addShape(...) into my class. I also copied convertedPoints and normalizedPoint. You can see that in my convertedPoints(...) I am not actually using the normalizedPoint constant (as I am not sure what to do there).
private func convertedPoints(
    from points: [NSValue]?,
    width: CGFloat,
    height: CGFloat
) -> [NSValue]? {
    return points?.map {
        let cgPointValue = $0.cgPointValue
        let normalizedPoint = CGPoint(x: cgPointValue.x / width, y: cgPointValue.y / height)
        //let cgPoint = previewLayer.layerPointConverted(fromCaptureDevicePoint: normalizedPoint)
        let cgPoint = capturedImageView.frame.origin
        let value = NSValue(cgPoint: cgPoint)
        return value
    }
}
And this is my normalizedPoint(...)
method:
private func normalizedPoint(
    fromVisionPoint point: VisionPoint,
    width: CGFloat,
    height: CGFloat
) -> CGPoint {
    let cgPoint = CGPoint(x: CGFloat(point.x.floatValue), y: CGFloat(point.y.floatValue))
    var normalizedPoint = CGPoint(x: cgPoint.x / width, y: cgPoint.y / height)
    normalizedPoint = capturedImageView.frame.origin
    return normalizedPoint
}
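For context on why all the rectangles collapse: both helpers above end up returning `capturedImageView.frame.origin`, so every corner maps to the same spot. The usual pattern is to normalize each corner by the image size and then scale by the view size. A minimal, language-agnostic sketch of that math in Python (hypothetical helper name, assuming the image exactly fills the view with no letterboxing):

```python
def view_point(corner, image_size, view_size):
    """Map an image-space corner point into view coordinates.

    Hypothetical helper for illustration only: normalize by the image
    size, then scale by the view size.
    """
    nx = corner[0] / image_size[0]
    ny = corner[1] / image_size[1]
    return (nx * view_size[0], ny * view_size[1])

# The center of a 4000x3000 image lands at the center of a 400x300 view.
print(view_point((2000, 1500), (4000, 3000), (400, 300)))  # -> (200.0, 150.0)
```

Doing this per corner point, rather than per frame, is what lets the drawn quadrilateral follow rotated text.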
TL;DR
I would like the shape of the highlight to adapt to slightly rotated images, similar to the example mlvision
application (with the live AVFoundation camera run) in this repository.
I have been playing around with these samples using an Emulator on API 29 (Pixel 2).
I noticed this with https://github.com/googlesamples/mlkit/tree/master/android/vision-quickstart
The Java version works fine without issues; however, I run into issues with the Kotlin version.
On the Kotlin version, I am unable to move the emulator camera around and it does not seem to detect any objects. In the video below, you'll notice the message below the emulator keeps showing/dismissing non-stop.
Here's a video: https://youtu.be/jzaUEeOZT2Q
This is reproducible by myself and others on my team.
Let me know if there is any other information needed.
This should be fixed by 675a2e5#diff-06b93ae9f3a30417bb2b7188c05a82c5
Feel free to reopen if the issue persists.
Not sure if it is the same issue, but I've been seeing a very similar crash. Please advise.
The crash reports come from iPhone 6s, 7, 7 Plus, 8, 8 Plus, and X. They're all on iOS 12. On average with free RAM of 103.36 MB. 0% in background. 0% jailbroken.
Podfile.lock:
- Firebase/MLVision (5.20.2):
- Firebase/CoreOnly
- FirebaseMLVision (= 0.15.0)
- Firebase/MLVisionTextModel (5.20.2):
- Firebase/CoreOnly
- FirebaseMLVisionTextModel (= 0.15.0)
Device:
Model: iPhone X
Orientation: Face Up
RAM free: 85.31 MB
Disk free: 47.43 GB
Version: 12.3.1 (16F203)
Orientation: Portrait
Jailbroken: No
Keys:
crash_info_entry_0: abort() called
Stack Trace:
Crashed: Thread #1
SIGABRT ABORT 0x000000018b3d60dc
0 libsystem_kernel.dylib 0x18b3d60dc __pthread_kill + 8
1 libsystem_pthread.dylib 0x18b4539b0 pthread_kill$VARIANT$armv81 + 296
2 libsystem_c.dylib 0x18b32fea8 abort + 140
3 MyApp 0x10537d498 PythonGilHolderLookup::Register(long long (*)()) + 4344779928
4 MyApp 0x10537ef48 (anonymous namespace)::ThreadLivenessWatcher(void*) + 4344786760
5 libsystem_pthread.dylib 0x18b4582c0 _pthread_body + 128
6 libsystem_pthread.dylib 0x18b458220 _pthread_start + 44
7 libsystem_pthread.dylib 0x18b45bcdc thread_start + 4
com.apple.main-thread
0 libsystem_kernel.dylib 0x18b3cb0f4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x18b3ca5a0 mach_msg + 72
2 CoreFoundation 0x18b7caa10 __CFRunLoopServiceMachPort + 236
3 CoreFoundation 0x18b7c5920 __CFRunLoopRun + 1360
4 CoreFoundation 0x18b7c50b0 CFRunLoopRunSpecific + 436
5 GraphicsServices 0x18d9c579c GSEventRunModal + 104
6 UIKitCore 0x1b8031978 UIApplicationMain + 212
7 MyApp 0x1047e92dc main (MyAppnizeScanViewController.swift:18)
8 libdyld.dylib 0x18b28a8e0 start + 4
com.apple.uikit.eventfetch-thread
0 libsystem_kernel.dylib 0x18b3cb0f4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x18b3ca5a0 mach_msg + 72
2 CoreFoundation 0x18b7caa10 __CFRunLoopServiceMachPort + 236
3 CoreFoundation 0x18b7c5920 __CFRunLoopRun + 1360
4 CoreFoundation 0x18b7c50b0 CFRunLoopRunSpecific + 436
5 Foundation 0x18c192fac -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 300
6 Foundation 0x18c192e3c -[NSRunLoop(NSRunLoop) runUntilDate:] + 96
7 UIKitCore 0x1b8117494 -[UIEventFetcher threadMain] + 136
8 Foundation 0x18c2bf6a4 __NSThread__start__ + 984
9 libsystem_pthread.dylib 0x18b4582c0 _pthread_body + 128
10 libsystem_pthread.dylib 0x18b458220 _pthread_start + 44
11 libsystem_pthread.dylib 0x18b45bcdc thread_start + 4
Thread #2
0 libsystem_kernel.dylib 0x18b3d6b74 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x18b4591f8 _pthread_wqthread + 532
2 libsystem_pthread.dylib 0x18b45bcd4 start_wqthread + 4
Thread #3
0 libsystem_kernel.dylib 0x18b3d6b74 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x18b4591f8 _pthread_wqthread + 532
2 libsystem_pthread.dylib 0x18b45bcd4 start_wqthread + 4
com.twitter.crashlytics.ios.MachExceptionServer
0 libsystem_kernel.dylib 0x18b3cb0f4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x18b3ca5a0 mach_msg + 72
2 MyApp 0x1048af90c CLSMachExceptionServer (CLSMachException.c:180)
3 libsystem_pthread.dylib 0x18b4582c0 _pthread_body + 128
4 libsystem_pthread.dylib 0x18b458220 _pthread_start + 44
5 libsystem_pthread.dylib 0x18b45bcdc thread_start + 4
com.apple.CoreMLBatchProcessingQueue
0 libsystem_kernel.dylib 0x18b3d5ee4 __psynch_cvwait + 8
1 libsystem_pthread.dylib 0x18b4554a4 _pthread_cond_wait$VARIANT$armv81 + 628
2 Metal 0x18d97a668 -[_MTLCommandBuffer waitUntilCompleted] + 80
3 Espresso 0x199f9fbd0 Espresso::batch_metal::sync_wait() + 100
4 Espresso 0x199eccbb8 Espresso::abstract_context::compute_batch_sync(void (std::__1::shared_ptr<Espresso::abstract_batch> const&) block_pointer) + 116
5 Espresso 0x199e722d4 EspressoLight::espresso_plan::execute_sync() + 308
6 Espresso 0x199e750e4 espresso_plan_execute_sync + 72
7 CoreML 0x19a2fca74 __53-[MLNeuralNetworkEngine evaluateNoAutoRelease:error:]_block_invoke + 200
8 libdispatch.dylib 0x18b2797d4 _dispatch_client_callout + 16
9 libdispatch.dylib 0x18b25a5fc _dispatch_lane_barrier_sync_invoke_and_complete + 56
10 CoreML 0x19a2fc84c -[MLNeuralNetworkEngine evaluateNoAutoRelease:error:] + 312
11 CoreML 0x19a2f9e2c -[MLNeuralNetworkEngine evaluate:error:] + 116
12 CoreML 0x19a3030d8 __62-[MLNeuralNetworkEngine predictionFromFeatures:options:error:]_block_invoke + 80
13 libdispatch.dylib 0x18b2797d4 _dispatch_client_callout + 16
14 libdispatch.dylib 0x18b25a5fc _dispatch_lane_barrier_sync_invoke_and_complete + 56
15 CoreML 0x19a302f9c -[MLNeuralNetworkEngine predictionFromFeatures:options:error:] + 284
16 MyApp 0x1048a9e94 Inceptionv3.prediction(input:) (Inceptionv3.swift:141)
17 MyApp 0x104861754 closure #1 in ScanControllerObject.detectPhoto(image:asset:scanText:scanObject:scanNude:updateObject:) (Inceptionv3.swift:154)
18 RealmSwift 0x106597f5c Realm.write(_:) (Realm.swift:155)
19 MyApp 0x10485f048 specialized closure #4 in closure #1 in ScanControllerObject.scanPhotos(assetArray:) (ScanControllerObject.swift:395)
20 MyApp 0x104865300 partial apply for specialized (<compiler-generated>)
21 MyApp 0x10486e8e8 thunk for @escaping @callee_guaranteed (@guaranteed UIImage?, @guaranteed [AnyHashable : Any]?) -> () (<compiler-generated>)
22 Photos 0x19a197864 __84-[PHImageManager requestImageForAsset:targetSize:contentMode:options:resultHandler:]_block_invoke_3 + 320
23 Photos 0x19a18c7fc -[PHCoreImageManager _fetchPreheatableAnySizeImageAsNon5551BytesWithRequest:library:format:bestFormat:preheatItem:optimalSourcePixelSize:sync:fireAndForgetCPLDownload:completionHandler:] + 2060
24 Photos 0x19a190f8c -[PHCoreImageManager _handleNormalImageRequest:library:deliveryMode:degradedFormat:bestFormat:optimalSourcePixelSize:wantsImageDataOrURL:sync:isFinalStageOfStagedRequest:isResponseToSharedStreamsDownload:] + 2068
25 Photos 0x19a18f618 -[PHCoreImageManager _processImageRequest:sync:] + 2700
26 Photos 0x19a1926a4 -[PHImageManager requestSynchronousImageForAsset:targetSize:contentMode:options:completionHandler:] + 520
27 Photos 0x19a1973ac -[PHImageManager requestImageForAsset:targetSize:contentMode:options:resultHandler:] + 620
28 Photos 0x19a19a8c4 -[PHCachingImageManager requestImageForAsset:targetSize:contentMode:options:resultHandler:] + 308
29 MyApp 0x10485dd88 closure #1 in ScanControllerObject.scanPhotos(assetArray:) (ScanControllerObject.swift:271)
30 MyApp 0x10486edac thunk for @escaping @callee_guaranteed (@guaranteed PHAsset, @unowned Int, @unowned UnsafeMutablePointer<ObjCBool>) -> () (<compiler-generated>)
31 CoreFoundation 0x18b818a40 __NSArrayEnumerate + 412
32 Photos 0x19a27d384 -[PHFetchResult enumerateObjectsUsingBlock:] + 80
33 MyApp 0x10485cd30 ScanControllerObject.scanPhotos(assetArray:) (ScanControllerObject.swift:180)
34 MyApp 0x10485c4d0 closure #1 in ScanControllerObject.setupObject() (PhotoLibraryManager.swift:17)
35 MyApp 0x1048968bc thunk for @escaping @callee_guaranteed (@unowned PHAuthorizationStatus) -> () (<compiler-generated>)
36 Photos 0x19a1b4450 __39+[PHPhotoLibrary requestAuthorization:]_block_invoke + 80
37 AssetsLibraryServices 0x198f188f0 __79-[PLPrivacy _isPhotosAccessAllowedWithScope:forceHandler:accessAllowedHandler:]_block_invoke_2 + 508
38 AssetsLibraryServices 0x198f000e0 __pl_dispatch_async_block_invoke + 36
39 libdispatch.dylib 0x18b278a38 _dispatch_call_block_and_release + 24
40 libdispatch.dylib 0x18b2797d4 _dispatch_client_callout + 16
41 libdispatch.dylib 0x18b254dec _dispatch_lane_serial_drain$VARIANT$armv81 + 548
42 libdispatch.dylib 0x18b25592c _dispatch_lane_invoke$VARIANT$armv81 + 408
43 libdispatch.dylib 0x18b25de08 _dispatch_workloop_worker_thread + 584
44 libsystem_pthread.dylib 0x18b459114 _pthread_wqthread + 304
45 libsystem_pthread.dylib 0x18b45bcd4 start_wqthread + 4
Thread #4
0 libsystem_kernel.dylib 0x18b3d6b74 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x18b4591f8 _pthread_wqthread + 532
2 libsystem_pthread.dylib 0x18b45bcd4 start_wqthread + 4
Thread #5
0 libsystem_kernel.dylib 0x18b3d6b74 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x18b4591f8 _pthread_wqthread + 532
2 libsystem_pthread.dylib 0x18b45bcd4 start_wqthread + 4
Thread #6
0 libsystem_pthread.dylib 0x18b45bcd0 start_wqthread + 190
Thread #7
0 libsystem_kernel.dylib 0x18b3d7fd8 poll + 8
1 MyApp 0x10537f04c (anonymous namespace)::ExitTimeoutWatcher(void*) + 4344787020
2 libsystem_pthread.dylib 0x18b4582c0 _pthread_body + 128
3 libsystem_pthread.dylib 0x18b458220 _pthread_start + 44
4 libsystem_pthread.dylib 0x18b45bcdc thread_start + 4
RLMRealm notification listener
0 libsystem_kernel.dylib 0x18b3d78f4 kevent + 8
1 Realm 0x1060e3588 realm::_impl::ExternalCommitHelper::listen() (external_commit_helper.cpp:217)
2 Realm 0x1060e40d4 void* std::__1::__thread_proxy<std::__1::tuple<std::__1::unique_ptr<std::__1::__thread_struct, std::__1::default_delete<std::__1::__thread_struct> >, realm::_impl::ExternalCommitHelper::ExternalCommitHelper(realm::_impl::RealmCoordinator&)::$_0> >(void*) (tuple:170)
3 libsystem_pthread.dylib 0x18b4582c0 _pthread_body + 128
4 libsystem_pthread.dylib 0x18b458220 _pthread_start + 44
5 libsystem_pthread.dylib 0x18b45bcdc thread_start + 4
com.apple.NSURLConnectionLoader
0 libsystem_kernel.dylib 0x18b3cb0f4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x18b3ca5a0 mach_msg + 72
2 CoreFoundation 0x18b7caa10 __CFRunLoopServiceMachPort + 236
3 CoreFoundation 0x18b7c5920 __CFRunLoopRun + 1360
4 CoreFoundation 0x18b7c50b0 CFRunLoopRunSpecific + 436
5 CFNetwork 0x18bdde74c -[__CoreSchedulingSetRunnable runForever] + 216
6 Foundation 0x18c2bf6a4 __NSThread__start__ + 984
7 libsystem_pthread.dylib 0x18b4582c0 _pthread_body + 128
8 libsystem_pthread.dylib 0x18b458220 _pthread_start + 44
9 libsystem_pthread.dylib 0x18b45bcdc thread_start + 4
Thread #8
0 libsystem_kernel.dylib 0x18b3d6b74 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x18b459138 _pthread_wqthread + 340
2 libsystem_pthread.dylib 0x18b45bcd4 start_wqthread + 4
Thread #9
0 libsystem_kernel.dylib 0x18b3d6b74 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x18b4591f8 _pthread_wqthread + 532
2 libsystem_pthread.dylib 0x18b45bcd4 start_wqthread + 4
While detecting a barcode, the pulsing scanning animation momentarily shows the last rectangle before restarting.
Am I missing something, or is this the current behavior?
Appreciate any kind of help!
Thanks,
Raam mohan
I'm trying to use the new Google machine learning SDK, ML Kit, on an Android device that runs Android 9. From the official site:
ML Kit makes it easy to apply ML techniques in your apps by bringing Google's ML technologies, such as the Google Cloud Vision API, TensorFlow Lite, and the Android Neural Networks API together in a single SDK. Whether you need the power of cloud-based processing, the real-time capabilities of mobile-optimized on-device models, or the flexibility of custom TensorFlow Lite models, ML Kit makes it possible with just a few lines of code.
I think this means that on a device with at least Android 8.1 (according to the NNAPI documentation) the SDK can use NNAPI. But when I run the same app on a device with Android 7.1 (where NNAPI is not supported), I get the same performance as on the device running Android 9 (which, in theory, uses NNAPI). How can I use ML Kit with NNAPI? Am I doing something wrong? Link to the ML Kit documentation: https://firebase.google.com/docs/ml-kit/
See this sample image :
https://travelandynews.com/wp-content/uploads/2019/09/British-Airways-Amex-credit-Card.jpg
The card number should be detected as : 3759 876543 21001
but it is being detected as 3159 816543 21001
The reticle overlay over the preview image is set to a certain percentage (by default 74% x 8%) of the preview view. The image from the camera is cropped to the same percentage, but the image might not actually have the same aspect ratio as the preview view, which can result in text that is outside the overlay being recognized (or possibly text inside the overlay not being recognized).
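To make the mismatch concrete, here is a rough numeric sketch (Python, hypothetical helper) of where a centered reticle, specified as percentages of the preview view, actually lands in image pixels when the preview center-crops the image to fill the view:

```python
def reticle_rect_in_image(img_w, img_h, view_w, view_h, box_w_pct, box_h_pct):
    """Map a centered reticle (fractions of the preview view) into image
    pixel coordinates, assuming a fill-style preview that center-crops
    the image. Illustrative sketch only, not the sample's actual code."""
    # A fill-style preview scales by the larger of the two ratios.
    scale = max(view_w / img_w, view_h / img_h)
    # Portion of the image that is actually visible in the view.
    visible_w, visible_h = view_w / scale, view_h / scale
    # Reticle size in image pixels.
    box_w, box_h = visible_w * box_w_pct, visible_h * box_h_pct
    # Centered rect as (left, top, width, height).
    return ((img_w - box_w) / 2, (img_h - box_h) / 2, box_w, box_h)

# An 800x600 image behind a square 400x400 view: only a 600x600 region of
# the image is visible, so a 74% x 8% reticle covers roughly 444x48 image
# pixels rather than 74% x 8% of the full 800-pixel-wide image (592x48).
print(reticle_rect_in_image(800, 600, 400, 400, 0.74, 0.08))
```

Cropping by the raw percentages of the full image, as described above, therefore includes (or excludes) a strip of image that the overlay does not actually cover whenever the aspect ratios differ.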
Android device: Samsung Galaxy A3
Android OS version: 5.1.1
Google Play Services version: 14.7.99
Firebase/Play Services SDK versions:
api 'com.google.firebase:firebase-invites:16.0.4'
api 'com.google.firebase:firebase-messaging:17.3.4'
api 'com.google.firebase:firebase-core:16.0.4'
api 'com.google.android.gms:play-services-location:16.0.0'
api 'com.google.android.gms:play-services-analytics:16.0.4'
implementation 'com.google.firebase:firebase-ml-vision:18.0.1'
implementation 'com.google.firebase:firebase-ml-vision-face-model:17.0.2'
implementation 'com.google.firebase:firebase-ml-vision-image-label-model:17.0.2'
In most of the devices that I have used, when you turn your head left the Y euler angle is positive and when turning right the angle is negative.
But in a Samsung Galaxy A3 Android 5.1.1 this is inverse, left negative and right positive.
Is there a way to know if the angles are inverted on a specific device?
Y Euler angle:
left negative and right positive.
Y Euler angle:
left positive and right negative.
Hello,
Could you please explain to me why startIfReady() is called in onLayout()?
The app crashes the first time after giving the camera permission.
OS: Android 8.0.0
Device: HTC Desire 12+
Logcat:
2020-07-12 00:09:11.903 8419-8419/com.google.mlkit.showcase.translate E/AndroidRuntime: FATAL EXCEPTION: main
Process: com.google.mlkit.showcase.translate, PID: 8419
java.lang.IllegalStateException: Camera initialization failed.
at com.google.mlkit.showcase.translate.main.MainFragment.bindCameraUseCases(MainFragment.kt:214)
at com.google.mlkit.showcase.translate.main.MainFragment.access$bindCameraUseCases(MainFragment.kt:54)
at com.google.mlkit.showcase.translate.main.MainFragment$onRequestPermissionsResult$1.run(MainFragment.kt:347)
at android.os.Handler.handleCallback(Handler.java:789)
at android.os.Handler.dispatchMessage(Handler.java:98)
at android.os.Looper.loop(Looper.java:164)
at android.app.ActivityThread.main(ActivityThread.java:6565)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.Zygote$MethodAndArgsCaller.run(Zygote.java:240)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:767)
After some investigation, it seems that when the camera permission is accepted, we should initialize the camera by calling setUpCamera instead of bindCameraUseCases:
if (allPermissionsGranted()) {
viewFinder.post { bindCameraUseCases() }
}
When an image contains text on the same line separated by large spaces, the words are split into different blocks. This in turn messes up the order of lines in the text extraction, but the order of lines is very important in our case.
1. Provide an image with words on the same line separated by large spaces.
2. Check the output text; you can see the order of the lines is heavily disordered.
Output
ORIGINAL
L For Recioient
BILL OF SUPPLY
ABD CO
20/0/208
BOS0057
M.G.RAOD, Delhi, Delhi 110099
GSTIN 0TA
State
PAN
434
B Date
07-Delhi
AAECC8220
No
Reference Nio PO.78708
Customer Name
ACC &CO
Customar GSTIN
27A
Place of Supsly
Billing Addres
ACC &CO
Maharashtr
Shipping Addres
ACC&CO
Maharashtra
120
27-Maharashtra
Due Date 24/01/2038
Discount (
Rate / tem (
Total ()
tem
HSN/SAC Quantity
1,809.00
KGS
1.Slag for manufacturing iron
8,678.67
171,30,30,762.64
Total ()
1,30,30,762.64
1,30,30,763.00
0.36
One Crore Thirty Lakh Thirty Thousand Seven Hundred Sity Three Rupees Only
Total Value
Rounding off
Total amount in words)
For ABD Cco
Authorised Signatory
The expected result is line-by-line extraction; currently it is block-by-block extraction. Is there a way to achieve this?
Dependency used: 'com.google.firebase:firebase-ml-vision:20.0.0' (on-device text recognizer)
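Since block order is not guaranteed, one workaround is to flatten all recognized lines and re-sort them by their bounding boxes to approximate reading order. A rough post-processing sketch (Python; the tolerance and coordinates are illustrative, not an ML Kit API):

```python
def reading_order(lines, y_tolerance=10):
    """Reorder OCR lines into top-to-bottom, left-to-right reading order,
    grouping lines whose vertical positions fall within y_tolerance pixels
    into one row. Each input line is (text, x, y), with (x, y) the
    top-left of its bounding box. Hypothetical post-processing sketch."""
    rows = []
    # Group lines into rows by vertical proximity.
    for text, x, y in sorted(lines, key=lambda l: l[2]):
        if rows and abs(rows[-1][0] - y) <= y_tolerance:
            rows[-1][1].append((x, text))
        else:
            rows.append((y, [(x, text)]))
    # Within each row, read left to right.
    return [t for _, row in rows for _, t in sorted(row)]

lines = [("Total", 300, 101), ("Invoice", 10, 100), ("BOS0057", 10, 140)]
print(reading_order(lines))  # -> ['Invoice', 'Total', 'BOS0057']
```

The tolerance would need tuning per image resolution, and skewed scans may need a deskew step first, but this recovers same-line words that the recognizer split into separate blocks.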
I would like to use Firebase ML Kit custom models, and I run into a ProGuard obfuscation issue when initializing FirebaseModelInterpreter. Any ProGuard guideline/example would be appreciated.
I have added these rules
-keepnames class com.google.firebase.** { *; }
-keepnames class com.google.android.gms.** { *; }
In build.gradle, I set minifyEnabled true to enable ProGuard.
The logcat output with the crash:
java.lang.UnsatisfiedLinkError: No implementation found for java.lang.String r.c.a.e.b() (tried Java_r_c_a_e_b and Java_r_c_a_e_b__)
at r.c.a.e.b(Native Method)
at com.google.android.gms.internal.firebase_ml.zzpe.<clinit>(SourceFile:1)
at com.google.firebase.ml.custom.FirebaseModelInterpreter.<init>(SourceFile:16)
at com.google.firebase.ml.custom.FirebaseModelInterpreter.zza(SourceFile:6)
at com.google.firebase.ml.custom.FirebaseModelInterpreter.getInstance(SourceFile:1)
When I added -dontobfuscate
to my proguard rules, the initialization works.
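The UnsatisfiedLinkError above points at a renamed native method (r.c.a.e.b), which suggests classes backing JNI bindings are being obfuscated; -keepnames preserves names but still allows member obfuscation. As a starting point (an assumption, unverified against this exact SDK version), keep rules along these lines are often suggested for custom-model setups:

```
# Assumed additional keep rules, not official guidance for this SDK version:
# keep TensorFlow Lite classes, since their native methods are resolved by name
-keep class org.tensorflow.lite.** { *; }
# keep the Firebase ML classes outright rather than only their names
-keep class com.google.firebase.ml.** { *; }
```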
import com.google.android.gms.tasks.Task
import com.google.android.gms.tasks.Tasks
import com.google.firebase.ml.common.FirebaseMLException
import com.google.firebase.ml.common.modeldownload.FirebaseLocalModel
import com.google.firebase.ml.common.modeldownload.FirebaseModelManager
import com.google.firebase.ml.custom.FirebaseModelDataType
import com.google.firebase.ml.custom.FirebaseModelInputOutputOptions
import com.google.firebase.ml.custom.FirebaseModelInputs
import com.google.firebase.ml.custom.FirebaseModelInterpreter
import com.google.firebase.ml.custom.FirebaseModelOptions
fun init(context: Context) {
    val modelOptions = FirebaseModelOptions.Builder()
        .setLocalModelName("mobilenet_v2_1.0_224_quant")
        .build()
    val localModelSource = FirebaseLocalModel.Builder(MODEL_NAME)
        .setAssetFilePath("mobilenet_v2_1.0_224_quant.tflite")
        .build()
    val firebaseModelManager = FirebaseModelManager.getInstance().apply {
        registerLocalModel(localModelSource)
    }
    interpreter = FirebaseModelInterpreter.getInstance(modelOptions)
    labelList = loadLabelList(context.applicationContext)
    val inputDims = intArrayOf(DIM_BATCH_SIZE, DIM_IMG_SIZE, DIM_IMG_SIZE, DIM_PIXEL_SIZE)
    val outputDims = intArrayOf(1, labelList.size)
    val dataType = FirebaseModelDataType.BYTE
    dataOptions = FirebaseModelInputOutputOptions.Builder()
        .setInputFormat(0, dataType, inputDims)
        .setOutputFormat(0, dataType, outputDims)
        .build()
}
I believe TextureView performance is much better than SurfaceView, especially in fragments. So what's the reason behind using SurfaceView in this app?
Also, when I tried to convert it to TextureView, the scanner reticle in the middle turned black!
ScreenShot
This problem occurs on iOS.
Korean and Japanese are different languages; the two collide and the output comes back as Japanese.
On Android there is no problem.
Android and iOS have the same settings, so why is the output different?
Language hints setting:
options.languageHints = @[@"ko",@"en",@"zh-CN",@"ja"];
iOS code:
FIRVision *vision = [FIRVision vision];
FIRVisionCloudTextRecognizerOptions *options = [[FIRVisionCloudTextRecognizerOptions alloc] init];
//kor,@"en",@"ja",@"zh-CN"
options.languageHints = @[@"ko",@"en",@"zh-CN"];
FIRVisionTextRecognizer *textRecognizer = [vision cloudTextRecognizerWithOptions:options];
FIRVisionImage *image = [[FIRVisionImage alloc] initWithImage:img];
[textRecognizer processImage:image
completion:^(FIRVisionText *_Nullable result, NSError *_Nullable error) {
[self stopIndicator];
if(error != nil || result == nil) {
...
Note: on iOS, if @"ja" is removed from the hints, @"ko" is read.
After trying to add the library to an existing project with CocoaPods, I get the following errors:
Ld /Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Products/Debug-iphoneos/DictionaryShare.appex/DictionaryShare normal arm64 (in target 'DictionaryShare' from project 'IntoWords')
cd /Users/esben/Mikrov/git/IntoWords-iOS
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang -target arm64-apple-ios11.4 -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS13.5.sdk -L/Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Products/Debug-iphoneos -F/Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Products/Debug-iphoneos -F/Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Products/Debug-iphoneos/Alamofire -F/Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Products/Debug-iphoneos/FirebaseCore -F/Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Products/Debug-iphoneos/FirebaseCoreDiagnostics -F/Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Products/Debug-iphoneos/FirebaseCrashlytics -F/Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Products/Debug-iphoneos/FirebaseInstallations -F/Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Products/Debug-iphoneos/GTMSessionFetcher -F/Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Products/Debug-iphoneos/GoogleDataTransport -F/Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Products/Debug-iphoneos/GoogleDataTransportCCTSupport -F/Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Products/Debug-iphoneos/GoogleToolboxForMac -F/Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Products/Debug-iphoneos/GoogleUtilities 
-F/Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Products/Debug-iphoneos/GoogleUtilitiesComponents -F/Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Products/Debug-iphoneos/PromisesObjC -F/Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Products/Debug-iphoneos/Protobuf -F/Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Products/Debug-iphoneos/nanopb -F/Users/esben/Mikrov/git/IntoWords-iOS/Pods/MLKitBarcodeScanning/Frameworks -F/Users/esben/Mikrov/git/IntoWords-iOS/Pods/MLKitCommon/Frameworks -F/Users/esben/Mikrov/git/IntoWords-iOS/Pods/MLKitFaceDetection/Frameworks -F/Users/esben/Mikrov/git/IntoWords-iOS/Pods/MLKitImageLabeling/Frameworks -F/Users/esben/Mikrov/git/IntoWords-iOS/Pods/MLKitImageLabelingCommon/Frameworks -F/Users/esben/Mikrov/git/IntoWords-iOS/Pods/MLKitImageLabelingCustom/Frameworks -F/Users/esben/Mikrov/git/IntoWords-iOS/Pods/MLKitObjectDetection/Frameworks -F/Users/esben/Mikrov/git/IntoWords-iOS/Pods/MLKitObjectDetectionCommon/Frameworks -F/Users/esben/Mikrov/git/IntoWords-iOS/Pods/MLKitObjectDetectionCustom/Frameworks -F/Users/esben/Mikrov/git/IntoWords-iOS/Pods/MLKitTextRecognition/Frameworks -F/Users/esben/Mikrov/git/IntoWords-iOS/Pods/MLKitVision/Frameworks -F/Users/esben/Mikrov/git/IntoWords-iOS/Pods/MLKitVisionKit/Frameworks -filelist /Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Intermediates.noindex/IntoWords.build/Debug-iphoneos/DictionaryShare.build/Objects-normal/arm64/DictionaryShare.LinkFileList -Xlinker -rpath -Xlinker /usr/lib/swift -Xlinker -rpath -Xlinker @executable_path/Frameworks -Xlinker -rpath -Xlinker @loader_path/Frameworks -Xlinker -rpath -Xlinker @executable_path/../../Frameworks -Xlinker -rpath -Xlinker @executable_path/Frameworks -Xlinker -rpath -Xlinker @executable_path/../../Frameworks 
-dead_strip -Xlinker -object_path_lto -Xlinker /Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Intermediates.noindex/IntoWords.build/Debug-iphoneos/DictionaryShare.build/Objects-normal/arm64/DictionaryShare_lto.o -Xlinker -export_dynamic -Xlinker -no_deduplicate -fembed-bitcode-marker -fapplication-extension -fobjc-link-runtime -L/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/lib/swift/iphoneos -L/usr/lib/swift -Xlinker -add_ast_path -Xlinker /Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Intermediates.noindex/IntoWords.build/Debug-iphoneos/DictionaryShare.build/Objects-normal/arm64/DictionaryShare.swiftmodule -ObjC -lc++ -lsqlite3 -lz -framework AVFoundation -framework Accelerate -framework Alamofire -framework CFNetwork -framework CoreGraphics -framework CoreImage -framework CoreLocation -framework CoreMedia -framework CoreVideo -framework FBLPromises -framework FirebaseCrashlytics -framework FirebaseInstallations -framework Foundation -framework GoogleUtilities -framework GoogleUtilitiesComponents -framework LocalAuthentication -framework MLKitBarcodeScanning -framework MLKitCommon -framework MLKitFaceDetection -framework MLKitImageLabeling -framework MLKitImageLabelingCommon -framework MLKitImageLabelingCustom -framework MLKitObjectDetection -framework MLKitObjectDetectionCommon -framework MLKitObjectDetectionCustom -framework MLKitTextRecognition -framework MLKitVision -framework MLKitVisionKit -framework Protobuf -framework Security -framework SystemConfiguration -framework UIKit -framework nanopb -e _NSExtensionMain -framework SharedFramework -framework Pods_DictionaryShare -Xlinker -dependency_info -Xlinker /Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Intermediates.noindex/IntoWords.build/Debug-iphoneos/DictionaryShare.build/Objects-normal/arm64/DictionaryShare_dependency_info.dat 
-o /Users/esben/Library/Developer/Xcode/DerivedData/IntoWords-apsxgnmsdmgceqcnwfxgnosdzprw/Build/Products/Debug-iphoneos/DictionaryShare.appex/DictionaryShare
Undefined symbols for architecture arm64:
"_OBJC_CLASS_$_GTMLogMininumLevelFilter", referenced from:
objc-class-ref in MLKitCommon(GIPLoggingReroutingGTMLogger_4bb39274d070a62f02bcedfcf884786c.o)
"_OBJC_METACLASS_$_GTMLogger", referenced from:
_OBJC_METACLASS_$_MLKITx_GIPLoggingReroutingGTMLogger in MLKitCommon(GIPLoggingReroutingGTMLogger_4bb39274d070a62f02bcedfcf884786c.o)
"_OBJC_CLASS_$_GTMSessionFetcher", referenced from:
objc-class-ref in MLKitCommon(CCTClearcutUploader_88b80034c9cfd533c7ff737cf3d8f171.o)
objc-class-ref in MLKitCommon(PHTHeterodyneSyncer_33e5c01e6d360fc8d44bfa5aa1cf3f38.o)
objc-class-ref in MLKitCommon(PHTInternalHeterodyneSyncer_b4678a6e3e57b0f4fd5c08a59a71f80a.o)
"_OBJC_CLASS_$_GTMSessionFetcherService", referenced from:
objc-class-ref in MLKitCommon(MLKModelDownloader_9c43dec80e40e74b82e5f6cf057d9d58.o)
objc-class-ref in MLKitVision(GMVCloudVisionClient_53c7716fb19c7008d8e3d8705d5d50e9.o)
"_OBJC_CLASS_$_GTMSessionCookieStorage", referenced from:
objc-class-ref in MLKitCommon(CCTClearcutUploader_88b80034c9cfd533c7ff737cf3d8f171.o)
"_OBJC_CLASS_$_GTMLogger", referenced from:
objc-class-ref in MLKitCommon(CCTClearcutMetaLogger_6adfb680567c3bb8c12fcf8136e6ad6e.o)
objc-class-ref in MLKitCommon(PHTFlatFilePhenotype_d5874f1b858c68ba33b7a65c87d8bdff.o)
objc-class-ref in MLKitCommon(CCTClearcutUploader_88b80034c9cfd533c7ff737cf3d8f171.o)
objc-class-ref in MLKitCommon(PHTURL_219ca1e11ae6a76a890934900168db71.o)
objc-class-ref in MLKitCommon(CCTLogWriter_2fc7de2c5784d0f31036f579b9be5c82.o)
objc-class-ref in MLKitCommon(CCTClearcutLogEvent_21ff928afc0c2ec2572977c486310fcd.o)
objc-class-ref in MLKitCommon(GIPPseudonymousIDStore_2f2b28df7320415a6c1c95341260745f.o)
...
"_kGTMSessionFetcherStatusDomain", referenced from:
___69-[MLKModelDownloader beginModelDownloadWithURL:modelInfo:conditions:]_block_invoke.269 in MLKitCommon(MLKModelDownloader_9c43dec80e40e74b82e5f6cf057d9d58.o)
___76-[MLKITx_PHTHeterodyneSyncer syncAccount:syncedScopes:fetchReason:callback:]_block_invoke_2 in MLKitCommon(PHTHeterodyneSyncer_33e5c01e6d360fc8d44bfa5aa1cf3f38.o)
___91-[MLKITx_PHTInternalHeterodyneSyncer syncHoldingLockWithSyncedScopes:fetchReason:callback:]_block_invoke_2 in MLKitCommon(PHTInternalHeterodyneSyncer_b4678a6e3e57b0f4fd5c08a59a71f80a.o)
___63-[MLKITx_GMVCloudVisionClient initWithCloudUri:apiKey:options:]_block_invoke in MLKitVision(GMVCloudVisionClient_53c7716fb19c7008d8e3d8705d5d50e9.o)
ld: symbol(s) not found for architecture arm64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
It appears that no more than three 2D barcodes can be detected in a given image. When passing preview frames that contain more than three 2D barcodes to the barcode detector, only three barcodes are returned. Detection sort of bounces around across all of the 2D barcodes, but only three can be scanned/tracked at any given time. Conversely, six to eight 1D barcodes seem to be detected/tracked just fine at any given time.
Is this a limitation of the barcode detector given the complexity of 2D barcodes, or is this a known issue that can be remediated?