Practical Hardware (Part 1)
I. QR Code Scanning
1. Key points
(0) Framework: AVFoundation
(1) Input device: the camera
(2) Output: metadata; the QR code is decoded into a string
(3) Session: an AVCaptureSession ties the input and the output together once they are added to it
(4) Preview layer: a special layer dedicated to displaying what the input device captures
(5) The session must be started and stopped explicitly
2. What a QR code really is: a string
3. The output's delegate, AVCaptureMetadataOutputObjectsDelegate (it has a single delegate method)
(1) Called once decoding finishes, returning a string:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection;
(2) The delegate is set in an unusual way, together with a dispatch queue:
[self.output setMetadataObjectsDelegate:self queue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)];
4. Configuration example
#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController () <AVCaptureMetadataOutputObjectsDelegate>
// 1. Input device
@property (nonatomic, strong) AVCaptureDeviceInput *deviceInput;
// 2. Output device: metadata
@property (nonatomic, strong) AVCaptureMetadataOutput *output;
// 3. Session
@property (nonatomic, strong) AVCaptureSession *session;
// 4. Preview layer: a special layer dedicated to displaying what the input device captures
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // 1. Create the input device (could be a camera, microphone, etc.;
    //    a device may have more than one camera)
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    self.deviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:nil];
    // 2. Create the metadata output and set its delegate together with a dispatch queue
    self.output = [[AVCaptureMetadataOutput alloc] init];
    [self.output setMetadataObjectsDelegate:self queue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)];
    // 3. Create the session
    self.session = [[AVCaptureSession alloc] init];
    // 4. Tie input and output together (check first whether they can be added)
    if ([self.session canAddInput:self.deviceInput]) {
        [self.session addInput:self.deviceInput];
    }
    if ([self.session canAddOutput:self.output]) {
        [self.session addOutput:self.output];
    }
    // Restrict the metadata type to QR codes; many other types exist, e.g. barcodes
    [self.output setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode]];
    NSLog(@"%@", self.output.availableMetadataObjectTypes);
    // Video capture quality
    [self.session setSessionPreset:AVCaptureSessionPresetHigh];
    // 5. Start the session
    [self.session startRunning];
    // 6. Add a preview layer to show the input device's feed
    self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
    self.previewLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:self.previewLayer];
}

// 7. Called once decoding finishes, returning a string
//    (the delegate queue is a global queue, so hop to the main queue for UI work)
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
    // Extract the decoded result
    AVMetadataMachineReadableCodeObject *object = [metadataObjects firstObject];
    NSLog(@"%@", object.stringValue);
    // Stop the session
    [self.session stopRunning];
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.previewLayer removeFromSuperlayer];
    });
}
@end
II. QR Code Generation
1. Framework: Core Image
2. Core tool: the CIFilter filter class
3. Configuration example
#import "ViewController.h"
#import <CoreImage/CoreImage.h>

@interface ViewController ()
@property (weak, nonatomic) IBOutlet UIImageView *imageView;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // List the built-in filters; the QR code generator is among them
    NSLog(@"%@", [CIFilter filterNamesInCategory:kCICategoryBuiltIn]);
    CIFilter *filter = [CIFilter filterWithName:@"CIQRCodeGenerator"];
    // Reset the filter to its default values
    [filter setDefaults];
    // Wrap the string to encode as NSData and hand it to the filter
    // (the string to encode goes here; it is empty in this example)
    NSLog(@"%@", filter.inputKeys);
    [filter setValue:[@"" dataUsingEncoding:NSUTF8StringEncoding] forKey:@"inputMessage"];
    // Scale the output up so the QR code image stays sharp
    CIImage *resultImage = [filter.outputImage imageByApplyingTransform:CGAffineTransformMakeScale(5, 5)];
    // Assign the generated image to the UI control
    self.imageView.image = [UIImage imageWithCIImage:resultImage];
}
@end
III. The Accelerometer
1. Purpose: detecting device motion (e.g. shaking)
2. How the accelerometer works
(1) It measures the device's acceleration along the X, Y and Z axes (i.e. along which axis a force acts and in which direction the device moves)
(2) From the acceleration values, the force acting along each axis can be judged
3. Developing with the accelerometer
(1) Before iOS 4: UIAccelerometer (part of UIKit), very simple to use; deprecated as of iOS 5
(2) From iOS 4 onward: CoreMotion.framework
(3) Although UIAccelerometer is deprecated, its API is so simple that it still lingers in many codebases
4. About Core Motion
(1) With the iPhone 4 the accelerometer was upgraded across the board and a gyroscope was added, making motion-related programming a centerpiece; Apple therefore introduced a framework dedicated to motion handling in iOS 4: CoreMotion.framework
(2) Core Motion not only delivers real-time acceleration and rotation-rate values; more importantly, Apple built a number of powerful algorithms into it
5. UIAccelerometer usage example (a delegate method receives the accelerometer values)
@interface ViewController () <UIAccelerometerDelegate>
@property (nonatomic, strong) UIAccelerometer *accelerometer;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Measures acceleration (including gravity) along the device's three axes;
    // note that +Y points up. The accelerometer is hardware, exposed as a singleton.
    UIAccelerometer *accelerometer = [UIAccelerometer sharedAccelerometer];
    // Sampling interval
    accelerometer.updateInterval = 1 / 30.0;
    // Delegate
    accelerometer.delegate = self;
    self.accelerometer = accelerometer;
}

// Called whenever the acceleration values are updated:
// parameter 1 is the accelerometer, parameter 2 the new reading
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration
{
    NSLog(@"%f %f %f", acceleration.x, acceleration.y, acceleration.z);
}
@end
6. A Core Motion accelerometer example: a rolling ball
(1) Import the framework and create a CMMotionManager
(2) Check that the accelerometer is available (recommended): isAccelerometerAvailable
(3) Start sampling with startAccelerometerUpdatesToQueue:withHandler:; the samples arrive in the handler as accelerometerData
(4) Example source (the class extension, helper class, images and sound files are not shown)
#import "ViewController.h"
#import <CoreMotion/CoreMotion.h>
#import "UIView+Extension.h"
#import "AudioTool.h"

@interface ViewController ()
@property (nonatomic, strong) CMMotionManager *manager;
@property (nonatomic, strong) UIImageView *ball;
@property (nonatomic, assign) CGPoint speed;
// Records the ball's position from the previous tick
@property (nonatomic, assign) CGPoint location;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Create the motion manager
    CMMotionManager *manager = [[CMMotionManager alloc] init];
    self.manager = manager;
    self.manager.accelerometerUpdateInterval = 1 / 30.0;
    // Create the ball and the background
    self.ball = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"bb_ball"]];
    self.ball.frame = CGRectMake(50, 50, 50, 50);
    UIImageView *back = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"bb_background.jpg"]];
    back.frame = CGRectMake(0, 0, [UIScreen mainScreen].bounds.size.width, [UIScreen mainScreen].bounds.size.height);
    [self.view addSubview:back];
    [self.view addSubview:self.ball];
    // Receive accelerometer samples
    [manager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue] withHandler:^(CMAccelerometerData * _Nullable accelerometerData, NSError * _Nullable error) {
        // Integrate acceleration into the velocity; per tick, the velocity
        // doubles as the displacement
        _speed.x += accelerometerData.acceleration.x;
        _speed.y -= accelerometerData.acceleration.y;
        // Update the ball's position
        self.ball.x += _speed.x;
        self.ball.y += _speed.y;
        // Collision detection: on hitting an edge, clamp the ball to the edge,
        // then reverse and halve its velocity
        if (self.ball.x <= 0) {
            self.ball.x = 0;
            _speed.x *= -0.5;
        }
        if (self.ball.x >= self.view.width - self.ball.width) {
            self.ball.x = self.view.width - self.ball.width;
            _speed.x *= -0.5;
        }
        if (self.ball.y <= 0) {
            self.ball.y = 0;
            _speed.y *= -0.5;
        }
        if (self.ball.y >= self.view.height - self.ball.height) {
            self.ball.y = self.view.height - self.ball.height;
            _speed.y *= -0.5;
        }
        // Play a sound on impact, but only once per impact; `location` is
        // tracked so a ball resting on an edge does not retrigger the sound
        if (self.ball.x == 0 || self.ball.x == self.view.width - self.ball.width || self.ball.y == self.view.height - self.ball.height || self.ball.y == 0) {
            if (self.location.x != self.ball.x && self.location.y != self.ball.y) {
                [AudioTool playWithFileName:@"1.aif"];
            }
        }
        // A ball already sliding along the top or bottom edge that then hits
        // a side edge still needs to make a sound
        if (self.ball.y == 0 || self.ball.y == self.view.height - self.ball.height) {
            if (self.ball.x == 0 && self.ball.x != self.location.x) {
                [AudioTool playWithFileName:@"1.aif"];
            }
            if (self.ball.x == self.view.width - self.ball.width && self.ball.x != self.location.x) {
                [AudioTool playWithFileName:@"1.aif"];
            }
        }
        // Same for a ball sliding along a side edge that hits the top or bottom
        if (self.ball.x == 0 || self.ball.x == self.view.width - self.ball.width) {
            if (self.ball.y == 0 && self.ball.y != self.location.y) {
                [AudioTool playWithFileName:@"1.aif"];
            }
            if (self.ball.y == self.view.height - self.ball.height && self.ball.y != self.location.y) {
                [AudioTool playWithFileName:@"1.aif"];
            }
        }
        // Record the ball's position for the next tick
        _location.x = self.ball.x;
        _location.y = self.ball.y;
    }];
}
@end
7. A simple "shake" implementation
(1) Ways to detect a shake
Option 1: analyze the accelerometer data to decide whether a shake occurred (rather complex)
Option 2: use the shake-monitoring API built into UIKit (very simple)
(2) Steps: implement the 3 shake-monitoring methods
<1> Shake detected
- (void)motionBegan:(UIEventSubtype)motion withEvent:(UIEvent *)event
<2> Shake cancelled (interrupted)
- (void)motionCancelled:(UIEventSubtype)motion withEvent:(UIEvent *)event
<3> Shake ended
- (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event
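The three methods above are motion-event overrides inherited from UIResponder, so a minimal sketch only needs a view controller that can receive motion events (it must be the first responder; the NSLog bodies below are placeholders for your own handling):

```objc
#import "ViewController.h"

@implementation ViewController

// The controller must be able to become first responder to receive motion events
- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [self becomeFirstResponder];
}

// <1> Shake detected
- (void)motionBegan:(UIEventSubtype)motion withEvent:(UIEvent *)event {
    NSLog(@"shake began");
}

// <2> Shake cancelled (interrupted)
- (void)motionCancelled:(UIEventSubtype)motion withEvent:(UIEvent *)event {
    NSLog(@"shake cancelled");
}

// <3> Shake ended: the usual place to trigger the "shake" feature
- (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event {
    if (motion == UIEventSubtypeMotionShake) {
        NSLog(@"shake ended");
    }
}
@end
```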
IV. The Proximity Sensor
1. It sits above the earpiece. Typical use: during a phone call, the screen locks and goes dark when your face gets close to it.
2. Configuration example
#import "ViewController.h"

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Switch the proximity sensor on
    [UIDevice currentDevice].proximityMonitoringEnabled = YES;
    // Listen for proximity-state changes
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(proximityStateDidChange:)
                                                 name:UIDeviceProximityStateDidChangeNotification
                                               object:nil];
}

- (void)proximityStateDidChange:(NSNotification *)noti
{
    if ([UIDevice currentDevice].proximityState) {
        NSLog(@"Device moved close");
    } else {
        NSLog(@"Device moved away");
    }
}
@end
V. Bluetooth (4.0 / BLE)
1. Bluetooth on iOS: iOS provides 4 frameworks for establishing Bluetooth connections
(1) GameKit.framework (simple to use)
Only connects iOS devices to each other; mostly used in games (e.g. head-to-head Gomoku); deprecated as of iOS 7
(2) MultipeerConnectivity.framework
Only connects iOS devices to each other; introduced in iOS 7; mainly used for file sharing (sandboxed files only)
(3) ExternalAccessory.framework
Can talk to third-party Bluetooth accessories, but the accessory must be certified through Apple's MFi program (uncommon in China)
(4) CoreBluetooth.framework (the popular choice today)
<1> Can talk to third-party Bluetooth devices, which must support Bluetooth 4.0
<2> Requires at least an iPhone 4S running iOS 6 or later
<3> Bluetooth 4.0 is known for its low power consumption, hence the common name BLE (Bluetooth Low Energy)
<4> Common applications today: fitness bands, embedded devices, smart home products
2. Core Bluetooth
(1) Core architecture diagram (image not shown)
(2) Bluetooth basics
<1> Every Bluetooth 4.0 device presents itself through services (Service) and characteristics (Characteristic)
<2> A device contains one or more services, and each service in turn contains a number of characteristics
<3> A characteristic is the smallest unit of interaction with the outside world:
For example, a Bluetooth 4.0 device might use characteristic A to expose its factory information and characteristic B to send and receive data
<4> Services and characteristics are each uniquely identified by a UUID, which is how different services and characteristics are told apart
<5> What each service and characteristic in a device does is defined by the hardware vendor, e.g. which ones are for interaction (read/write) and which ones expose module information (read-only)
(3) Core Bluetooth development steps
<1> Create the central manager
<2> Scan for peripherals (Discover Peripheral)
<3> Connect to a peripheral (Connect Peripheral)
<4> Discover the peripheral's services and characteristics (Discover Services And Characteristics)
<5> Exchange data with the peripheral through its characteristics (Explore And Interact)
<6> Disconnect (Disconnect)
3. Configuration example (will not run in the simulator)
#import "ViewController.h"
#import <CoreBluetooth/CoreBluetooth.h>

@interface ViewController () <CBCentralManagerDelegate, CBPeripheralDelegate>
@property (nonatomic, strong) CBCentralManager *centralManager;
@property (nonatomic, strong) NSMutableArray *peripherals;
@property (nonatomic, strong) CBCharacteristic *characteristic;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Create the central manager
    self.peripherals = [NSMutableArray array];
    CBCentralManager *centralManager = [[CBCentralManager alloc] initWithDelegate:self queue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)];
    self.centralManager = centralManager;
}

// Pick one of the discovered peripherals and connect to it
- (void)btnClickDidselect
{
    CBPeripheral *peripheral = self.peripherals[2];
    [self.centralManager connectPeripheral:peripheral options:nil];
}

- (void)btnClick
{
    // Scan for peripherals.
    // The UUID uniquely identifies a peripheral's service or characteristic;
    // it comes from the device vendor, typically in the documentation.
    // The service XXXXXX may contain several characteristics: TTTTTT, WWWWWWW, ...
    CBUUID *uuid = [CBUUID UUIDWithString:@"XXXXXX"];
    [self.centralManager scanForPeripheralsWithServices:@[uuid] options:nil];
}

- (void)btnClickOnSwitch
{
    // Using the characteristic saved earlier (and the peripheral it belongs to),
    // call the read/write methods to transfer data;
    // the peripheral responds according to the data it receives.
    // - (void)writeValue:(NSData *)data forCharacteristic:(CBCharacteristic *)characteristic type:(CBCharacteristicWriteType)type;
    // self.characteristic
}

#pragma mark - CBCentralManagerDelegate

// Required delegate method: called when the central's state changes;
// scanning is only valid once the state is CBCentralManagerStatePoweredOn
- (void)centralManagerDidUpdateState:(CBCentralManager *)central
{
    NSLog(@"central state: %ld", (long)central.state);
}

// Called whenever a peripheral is discovered
// Parameter 1: the central; parameter 2: the peripheral
- (void)centralManager:(CBCentralManager *)central didDiscoverPeripheral:(CBPeripheral *)peripheral advertisementData:(NSDictionary<NSString *,id> *)advertisementData RSSI:(NSNumber *)RSSI
{
    if (![self.peripherals containsObject:peripheral]) {
        [self.peripherals addObject:peripheral];
    }
}

// Called when a peripheral is connected successfully
// Parameter 1: the central; parameter 2: the peripheral
- (void)centralManager:(CBCentralManager *)central didConnectPeripheral:(CBPeripheral *)peripheral
{
    // Ask the peripheral to discover its services (set the delegate first)
    peripheral.delegate = self;
    [peripheral discoverServices:@[[CBUUID UUIDWithString:@"XXXXXX"]]];
}

#pragma mark - CBPeripheralDelegate

// Called when the peripheral's services have been discovered
- (void)peripheral:(CBPeripheral *)peripheral didDiscoverServices:(NSError *)error
{
    for (CBService *service in peripheral.services) {
        [peripheral discoverCharacteristics:@[[CBUUID UUIDWithString:@"TTTTTT"]] forService:service];
    }
    // E.g. a Bluetooth light bulb: service A -> 1 (switch), 2 (brightness), 3 (warm/cold), 4 (color);
    //      service B -> 5 (power usage), 6 (timer), ...
}

// Called when a service's characteristics have been discovered
- (void)peripheral:(CBPeripheral *)peripheral didDiscoverCharacteristicsForService:(CBService *)service error:(NSError *)error
{
    for (CBCharacteristic *characteristic in service.characteristics) {
        if ([characteristic.UUID.UUIDString isEqualToString:@"TTTTTT"]) {
            self.characteristic = characteristic;
        }
    }
}
@end