Swift Modules for React Native

React Native is an Objective-C application framework that bridges JavaScript applications running in the JavaScriptCore engine to iOS and Android native APIs.

In theory, you write your application logic in JSX and ES6/7 and transpile it to JavaScript, and the application framework loads all that as a bundle.

In practice, you will want to expose your own custom native code to your JavaScript application. You may want to provide access to 3rd party library APIs or iOS framework features that aren’t exposed (yet) by React Native.

React Native is written in Objective-C, but we can write modules in Swift and expose them to our applications. The documentation on the React Native site briefly talks about “Exporting Swift,” but is thin on the details of doing much of anything in Swift. In this article, we’ll do a deeper dive into interfacing Swift to JavaScript.

It is assumed you already have React Native and its prerequisites installed on your system. You can find the code used in this article in this GitHub repository: https://github.com/ModusCreateOrg/swift-modules-for-react-native.

Create a React Native project and open it in Xcode:

$ react-native init SwiftBridge && cd SwiftBridge
$ ls
android/          index.ios.js      node_modules/
index.android.js  ios/              package.json
$ ls ios
SwiftBridge             SwiftBridge.xcodeproj   SwiftBridgeTests
$ open ios/SwiftBridge.xcodeproj

Click the run button in Xcode and see that the project builds and runs.

At this point, we know that if the application later fails to compile or run, it is something we’ve done to the project that is at fault.

We’ll start by implementing the CalendarManager sample code from the React Native docs and seeing that it works.

First, we need to add our Swift source file. Right click on SwiftBridge and select “New File…”:

Choose iOS and Swift File from the “Choose template” dialog:

Choose a filename for the file:

You will be asked if you would like to configure an Objective-C bridging header. We will need this, so click on the Create button:

We copy the CalendarManager.swift example code from the React Native docs page and paste it into our CalendarManager.swift file:

//
//  CalendarManager.swift
//  SwiftBridge
//
//  Created by Michael Schwartz on 12/11/15.
//  Copyright © 2015 Facebook. All rights reserved.
//

import Foundation

// CalendarManager.swift

@objc(CalendarManager)
class CalendarManager: NSObject {

  @objc func addEvent(name: String, location: String, date: NSNumber) -> Void {
    // Date is ready to use!
  }

}

Unfortunately, we have to provide an Objective-C file that exposes our Swift to the React Native Objective-C framework. Create the file “CalendarManagerBridge.m” by selecting “New File…” as before and choose Objective-C File this time:

This time you will be presented with a “Choose options for your new file” dialog:

Enter “CalendarManagerBridge” in the File text field and click the Next button. Click the Create button on the next dialog:

The file is added to your project.

Copy and paste the code from the React Native docs page to this file:

// CalendarManagerBridge.m

#import "RCTBridgeModule.h"

@interface RCT_EXTERN_MODULE(CalendarManager, NSObject)

RCT_EXTERN_METHOD(addEvent:(NSString *)name location:(NSString *)location date:(NSNumber *)date)

@end
Finally, we edit the SwiftBridge-Bridging-Header.h file and copy the two lines from the React Native docs page there:

//  Use this file to import your target's public headers that you would like to expose to Swift.
//  CalendarManager-Bridging-Header.h

#import "RCTBridgeModule.h"

Click on the run button in Xcode again and the project should run. If not, you did something wrong in the above steps.

Let’s verify that our CalendarManager module is now exposed to JavaScript. Edit index.ios.js and add a console.dir() call:

var React = require('react-native'); // after this line
console.dir(React.NativeModules);    // ← add this line

When we run the project from Xcode with this line, we will get a red screen error:

It seems console.dir() is only present if we’re debugging via Chrome. Click the Dismiss (ESC) link on the red screen, then from the Simulator’s Hardware menu, choose Shake Gesture and choose “Debug in Chrome” from the action sheet that appears in the simulator window:

You should see a new window or tab in Chrome that looks something like this:

After hitting Command-Shift-J as the page suggests, you should see something like this:

Note that the console.dir() did work, and you can expand the Object to see that our React.NativeModules.CalendarManager object is exposed to JavaScript and it contains the addEvent() method as we expect.

Let’s implement some code in the addEvent() method to see that we can call it from JavaScript and access the arguments passed to Swift. Edit CalendarManager.swift so it looks like this:

    NSLog("%@ %@ %S" nameAll that’s really changed is the NSLog() call to dump the passed variables. Let’s also add a call to the addEvent() method to index.ios.js,just after the console.dir():

React.NativeModules.CalendarManager.addEvent('One', 'Two', 3);

When we run this, the application crashes. There is an error reported in both Chrome Dev Tools and Xcode, and in the simulator.

Note that the NSLog() did work, but the React Native framework displayed the red screen.

The fix for this is to add “nonnull” to the CalendarManagerBridge.m file:

RCT_EXTERN_METHOD(addEvent:(NSString *)name location:(NSString *)location date:(nonnull NSNumber *)date)

With this change, the app works without any errors. We can also see the NSLog() output in the Xcode console:

(Note: I created a GitHub issue about this problem with the React Native documentation page and it has been fixed).

We have verified we can access the arguments passed to our Swift method from JavaScript.

We cannot simply return values to JavaScript because React Native’s JavaScript/Native bridge is asynchronous. That is, you have to implement your Swift method with a callback parameter and call it with a callback function from JavaScript, or you may implement events.

Let’s examine the callback mechanism first. Change the RCT_EXTERN_METHOD line in CalendarManagerBridge.m to read:

RCT_EXTERN_METHOD(addEvent:(NSString *)name location:(NSString *)location date:(nonnull NSNumber *)date callback:(RCTResponseSenderBlock)callback)

This adds a 4th parameter to the method, a callback function. In CalendarManager.swift, we need to alter the addEvent() method:

  @objc func addEvent(name: String, location: String, date: NSNumber, callback: RCTResponseSenderBlock) -> Void {
    NSLog("%@ %@ %@", name, location, date)
    callback([[
      "name": name, "location": location, "date": date
    ]])
  }

What this version of addEvent() does is call the callback() method with a JavaScript Object that has the argument names/values as key/value pairs. In Swift, we create an NSObject with the [key: value] syntax. The argument to the callback from Swift is an array of argument values. In this case, we have just the one Object.
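As an aside, React Native callbacks conventionally follow the Node-style “error first” pattern, where the first element passed to the callback is an error value or null. The sketch below is a hypothetical variation on addEvent() illustrating that convention; the addEventChecked name is made up for illustration and is not part of the original example:

// Hypothetical variation illustrating the Node-style "error first" callback convention.
@objc func addEventChecked(name: String, location: String, date: NSNumber, callback: RCTResponseSenderBlock) -> Void {
  if name.isEmpty {
    // An error value in the first position tells the JavaScript caller something went wrong.
    callback(["name must not be empty"])
  } else {
    // NSNull() in the first position signals "no error"; the result payload follows.
    callback([NSNull(), ["name": name, "location": location, "date": date]])
  }
}

On the JavaScript side, such a callback would be written as function(err, result) and would check err before using result.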

We need to modify the JavaScript code in index.ios.js to pass a callback. It should look like this:

"One" "Two"functiono
    consolelogInCallback});

When we run this version of the code in the simulator, this is displayed in the JavaScript debugger console:

In order to use events, we need to modify the ‘CalendarManager-Bridging-Header.h’ file to import additional headers from React Native. The file should read:



#import "RCTBridge.h"#import "RCTBridgeModule.h"#import "RCTEventDispatcher.h"

An RCTBridge instance contains an eventDispatcher that we can use to send events to JavaScript from Swift. In order to get this instance, we can have one synthesized for us in our CalendarManager class. We can also verify that it is synthesized by using NSLog() to dump its value.

Modify the class’ code in CalendarManager.swift so it looks like this:

  
@objc(CalendarManager)
class CalendarManager: NSObject {

  var bridge: RCTBridge!  // this is synthesized

  @objc func addEvent(name: String, location: String, date: NSNumber, callback: RCTResponseSenderBlock) -> Void {
    NSLog("Bridge: %@", self.bridge)
    callback([[
      "name": name, "location": location, "date": date
    ]])
  }

}

There is the bridge member that will be synthesized, and in the addEvent() method there is a call to NSLog() to print the value of the bridge. The value printed should be some hex number that is the address of the bridge instance.

When we run the code, we can see that the bridge member is synthesized:

Note that the code uses NSLog() instead of print(). NSLog is synchronized and works better with threading. It also adds a timestamp, and its output goes to the device console when running on a device.

Ultimately, debug logging should be wrapped by some other means so the printing can be disabled or redirected as you want.
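A minimal sketch of such a wrapper, assuming a DEBUG compilation flag is defined for debug builds (the debugLog name is made up for illustration, not part of the original project):

// Hypothetical logging wrapper; compiles to a no-op when the DEBUG flag is not set.
func debugLog(_ message: String) {
  #if DEBUG
    NSLog("%@", message)
  #endif
}

// Usage inside a module method:
// debugLog("Bridge: \(self.bridge)")

Calls can then be left in place throughout the code, with the output disabled or redirected in one spot.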

Modify the CalendarManager class one more time so it looks like this:

  @objc func addEvent(name: String, location: String, date: NSNumber, callback: RCTResponseSenderBlock) -> Void {
    let ret = [
      "name": name, "location": location, "date": date
    ]
    callback([ret])
    self.bridge.eventDispatcher.sendAppEventWithName("EventReminder", body: ret)
  }

The sendAppEventWithName() method takes an event name and an arbitrary object that is sent to the JavaScript event handler. In the code above, we’re assigning the NSObject with the arguments as key/value pairs to a variable and using it to pass to both the callback() and the event argument.

Modify the JavaScript code near the top of index.ios.js so it reads:

var { NativeAppEventEmitter } = require('react-native');

var subscription = NativeAppEventEmitter.addListener(
  'EventReminder',
  (reminder) => {
    console.log('EVENT');
    console.log('name: ' + reminder.name);
    console.log('location: ' + reminder.location);
    console.log('date: ' + reminder.date);
  }
);

React.NativeModules.CalendarManager.addEvent('One', 'Two', 3, function(o) {
  console.log('In Callback', o);
});

We’re really just adding the subscription logic before calling addEvent(). When we run this version of the code, we see the expected output in the Chrome console:

React Native also lets a module export constants to JavaScript. Add a constantsToExport() method to the CalendarManager class:

  func constantsToExport() -> [String: AnyObject] {
    return [
      "x": 1,
      "y": 2,
      "z": "Arbitrary string"
    ]
  }

When we run the project and expand the first Object printed in the Chrome console, we see our constants:

In conclusion, we now have patterns to interface Swift native code to our JavaScript in React Native. From here we can implement our application logic in either language, as appropriate.


From: http://moduscreate.com/swift-modules-for-react-native/
