RPi NFT Sensor
Working with the awesome team at UNBND, we built a multi-sensor node that sent data to a Unity app to generate NFTs.
Above: Final format Raspberry Pi sensor with camera, mic and distance sensors.
About
This project combines sensors, procedural and generative digital art, and NFTs. It was planned as a live event (2-4 hours) plus an on-premise activation running for two weeks in multiple locations.
Project Status: ON HOLD
Due to COVID-19 the project has been placed on hold. The work completed includes a working proof-of-concept sensor and artwork component.
Here you will find all work completed to this point. What is missing is the artwork's creative direction and treatment, which was to be coordinated with a featured NFT artist who was TBD at the time of writing.
Tech Flow
This information architecture diagram shows how the technology flows from the sensors to the app and back. The diagram represents one Raspberry Pi (Node.js) and one art app (Unity) connected via websockets.
Sensors
Raspberry Pi and sensor setup
Connecting to Raspberry Pis
The existing Raspberry Pi can be connected to with the following command:
macOS
$ ssh pi@raspberry.local
Then enter the password
Windows
$ ssh pi@raspberry.local
Then enter the password
Troubleshooting
If you can't connect, ensure SSH has been enabled in the raspi-config settings and that you have the correct host name. If you don't have a host name you can use the Pi's IP address instead, which can be found with the command $ ifconfig.
Software
There are two main parts to this project: the server and the client. The server sits on each Raspberry Pi and is a websocket server that clients connect to; it transmits sensor data to all clients when requested. The client is a websocket client that connects to the host name or IP and sends a message command for each sensor, and in return the server sends back that sensor's data. The commands are as follows:
'cam' - requests the camera data
'dist' - requests the ultrasonic distance data
'audio' - requests the sound data from the mic
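As a sketch of this request/response flow, a minimal dispatch like the following could sit behind the server's message handler. The function and stub names here are illustrative assumptions, not the actual socket-server code; only the command strings come from the protocol above.

```javascript
// Map an incoming websocket command string to a sensor reading.
// Command names match the protocol above; everything else is a stub.
function handleCommand(command, sensors) {
  switch (command) {
    case 'cam':   return sensors.camera();   // current camera frame
    case 'dist':  return sensors.distance(); // ultrasonic distance reading
    case 'audio': return sensors.audio();    // mic frequency spectrum
    default:      return null;               // unknown command: no reply
  }
}

// Stubbed sensor readers for illustration:
const stubs = {
  camera:   () => ({ type: 'CAMERA_EVENT',   data: '<frame buffer>' }),
  distance: () => ({ type: 'DISTANCE_EVENT', data: 42.5 }),
  audio:    () => ({ type: 'AUDIO_EVENT',    data: [0.1, 0.2] }),
};
```

Any client that speaks websockets can drive this: send one of the three command strings and handle the reply.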
Server
About
This component is a node.js application which uses websockets to establish a web socket server for clients to connect to.
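A minimal sketch of such a server, assuming the popular `ws` npm package. The envelope shape mirrors the WSMessage fields the Unity client deserializes later in this document; the exact property names and handler structure in the repo may differ.

```javascript
// Build the JSON envelope a client deserializes (field names assumed
// from the Unity-side WSMessage usage: Message + Data).
function makeEnvelope(eventType, data) {
  return JSON.stringify({ Message: eventType, Data: data });
}

// Start a websocket server on port 8080 and answer 'dist' requests.
// Requires the 'ws' npm package; not started automatically here.
function startServer(port = 8080) {
  const { WebSocketServer } = require('ws');
  const wss = new WebSocketServer({ port });
  wss.on('connection', (socket) => {
    socket.on('message', (msg) => {
      if (msg.toString() === 'dist') {
        // A real server would read the ultrasonic sensor here.
        socket.send(makeEnvelope('DISTANCE_EVENT', 42.5));
      }
    });
  });
  return wss;
}
```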
Client
About
This component is a Unity application, but the client can be written in any language that supports websockets. It establishes a websocket client that sends custom messages and then receives sensor data.
Hardware
The hardware and sensors component of this project is modular and is based on the following:
- 1 x Raspberry Pi 4 (included)
- 1 x Raspberry Pi camera, either the standard model or, for low light, the NoIR (included with spares and variations)
- 1 x Ultrasonic sensor HC-SR04 (2-400cm) or HC-SR05 (3-500cm) (included with spares and variations)
- 1 x USB microphone (or USB audio interface) (included with spares and variations)
Other hardware requirements include:
- Raspberry Pi camera Housing (included)
- Wall mount
- USB-C Power supply
- WiFi router and connection
- GPIO out for testing (included)
- Heat sinks
- Micro SD card
Raspberry Pi Setup Requirements
This project assumes you have the following skills:
- Raspberry Pi - Raspberry Pi or Linux experience
- Electronics - read schematics and understand electronics fundamentals
- Network skills - SSH and connectivity required
- Git - clone and ssh-keygen required
Build and Setup Instructions
- Install the latest Raspbian on a 32GB SD card
- Run typical Raspberry Pi setup with $ sudo raspi-config
- Set new password
- Set hostname
- Expand drive
- Set boot mode to command line with password login (not desktop)
- Set WiFi
- Enable SSH
- Copy SSH keys if you need to dev on the Pi
- Enable camera
- Run Raspberry Pi Upgrade
- Run apt-get update
- Install Git
- Install Node
- Clone the socket-server repo
- Run $ npm install
- Connect hardware
- R Pi NOIR Camera and setup then test it works
- Mic and setup then test it works
- Ultrasonic sensor and setup then test it works
- Set up node socket service as a Daemon, PM2 or node Forever
- Install Forever globally with $ sudo npm install forever -g
- Edit `/etc/rc.local` (e.g. $ sudo nano /etc/rc.local) and add a line before exit, e.g. sudo node /home/pi/apps/sensors/socket-server/index.js
- Reboot the machine with $ sudo reboot
- The Pi will restart and launch the socket server with the sensor services running ready to be connected via websocket
Websocket Methods
The websocket runs on port 8080, so to connect you need to use the address or host name plus the port number. For example:
ws://raspberry:8080
Then from the artwork app (Unity, for example), connect to the socket, send a message, then handle the response.
Send Sensor Message in C#
ws.Send("cam"); // get camera sensor data
ws.Send("audio"); // get sound sensor data
ws.Send("dist"); // get distance sensor data
// Event type constants returned in the server's response (WsEventsType):
AUDIO_EVENT = "AUDIO_EVENT",
DISTANCE_EVENT = "DISTANCE_EVENT",
CAMERA_EVENT = "CAMERA_EVENT";
Receive Sensor Message in C#
ws = new WebSocket(ip);
ws.Connect();
ws.OnMessage += (sender, e) =>
{
    if (e.Data != null)
    {
        string targ = ((WebSocket)sender).Url.ToString();
        Debug.Log("Message Received from " + targ + ", Data : " + e.Data);
        WSMessage jsonData = JsonConvert.DeserializeObject<WSMessage>(e.Data);
        if (jsonData.Message == WsEventsType.AUDIO_EVENT)
        {
            Debug.Log("[WS - AUDIO_EVENT] Message Received from " + targ + ", Data : " + jsonData.Data);
            DataManager.instance.AddSoundSpectrum(jsonData.Audio, targ);
        }
    }
};
Sensors
Ultrasonic Distance Sensor
About
There were a few options for the sensor, and testing at the actual site was needed to determine the best-suited one. However, this POC was created with the most common and rugged sensors from the robotics industry, the HC-SR04 and HC-SR05.
Configuration
See Pi setup instructions
Code
The main class for the ultrasonic sensor can be found in the socket-server directory and is called getDistance. This is called from index.js and props are passed through, such as:
id: sensor id for differentiation
interval: int for update frequency
trigger: pin for trigger
echo: pin for echo
NOTE: There are JavaScript and Python scripts to help test that the sensor is connected and working if the sensor output data received is null.
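For reference, the HC-SR04/HC-SR05 reports distance as the duration of an echo pulse; converting that to centimetres is a one-liner. This is a sketch of the underlying maths, not the repo's getDistance code:

```javascript
// Sound travels roughly 343 m/s (0.0343 cm/µs), and the echo pulse
// covers the distance twice (out and back), so halve the result.
function echoToCentimetres(echoMicroseconds) {
  const SPEED_OF_SOUND_CM_PER_US = 0.0343;
  return (echoMicroseconds * SPEED_OF_SOUND_CM_PER_US) / 2;
}
```

A ~5831 µs echo therefore corresponds to roughly 100 cm, which is a useful sanity check when testing the sensor.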
Microphone
About
This sensor is a basic USB mic which is very simple to set up (see above). It pulls a frequency spectrum from the mic and passes it through.
Configuration
See Pi setup instructions
Code
The main class for the mic can be found in the socket-server directory and is called audioSensor.js. This is called from index.js and props are passed through, such as:
id: sensor id for differentiation
interval: int for update frequency
NOTE: There are JavaScript and Python scripts to help test that the sensor is connected and working if the sensor output data received is null.
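To illustrate how the artwork side might consume the spectrum data, here is a small helper that collapses a spectrum array into a single level value. This is an assumption about usage, not code from the repo:

```javascript
// Average the bins of a frequency spectrum into one loudness value,
// e.g. to drive a single parameter of the generative artwork.
function spectrumAverage(spectrum) {
  if (!Array.isArray(spectrum) || spectrum.length === 0) return 0;
  const sum = spectrum.reduce((total, bin) => total + bin, 0);
  return sum / spectrum.length;
}
```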
Camera
About
This sensor is a standard Raspberry Pi camera which is very simple to set up (see above). It pulls the current camera frame and passes it through as a Buffer object.
Configuration
See Pi setup instructions
Code
The main class for the camera can be found in the socket-server directory and is called videoSensor.js. This is called from index.js and props are passed through, such as:
id: sensor id for differentiation
interval: int for update frequency
width: int for width pixels
height: int for height pixels
fps: int for framerate
encoding: JPG, PNG, GIF or BMP
quality: int out of 10
NOTE: There are JavaScript and Python scripts to help test that the sensor is connected and working if the sensor output data received is null.
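Putting the list above together, a props object for videoSensor.js might look like the following. The property names and values here are assumptions based on the list, not verified against the repo:

```javascript
// Illustrative props for the camera sensor, mirroring the list above.
const cameraProps = {
  id: 'cam-01',    // sensor id for differentiation
  interval: 1000,  // update frequency
  width: 640,      // frame width in pixels
  height: 480,     // frame height in pixels
  fps: 15,         // framerate
  encoding: 'JPG', // JPG, PNG, GIF or BMP
  quality: 7,      // quality out of 10
};
```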
Reference Images
NFT Art Software
Unity
The Unity component of this project has three main parts: the websocket, the data manager and the artwork. As the artwork style and treatment were TBD, this part has been left out for simplicity.
Software Images
Above: Screenshot of socket server and client. The left screen is the SSH output of the Raspberry Pi socket server printing the sensor data and sending it to Unity. The right is the Unity socket client receiving the websocket data.
Supporting Documents
Recommended Hardware list - may change based on final execution and approach