Thursday, 3 October 2024

Automating a Live Stream with Home Assistant Based on Sun Elevation

Up until recently, I had been semi-manually starting and stopping the live stream.
Before logging off for the evening, I would calculate the number of seconds between that time and dawn, then use that to create this command:

sleep <seconds> && sh script.sh

I would monitor the stream throughout the day (for mod actions like bot whack-a-mole and to count as an extra viewer :) )

Just after sunset when the light had dimmed, I would then attach to the session and terminate the script.

I found that sometimes I would forget to run the first command, run it from the wrong path, or miscalculate the duration.


With this in mind, I worked on an automation project using Home Assistant to control the live stream based on the sun's position. The goal was to automate the start and stop of the stream depending on the sun's elevation.

The Plan

I wanted the stream to run from just before sunrise to just after sunset. To achieve this, I decided to start the stream when the sun’s elevation reaches -8.0 degrees (during dawn) and stop it again when the sun’s elevation drops to -8.0 degrees after sunset (during dusk). This would mean the stream ran as the sun travelled across the sky, creating a seamless integration between the time of day and the stream.

Step 1: Controlling the Stream Remotely via SSH

The stream is controlled by two scripts on a remote host, which I can access over SSH. These scripts use tmux to ensure the stream continues running even if the connection to the remote host is lost.

  • Start Script: Checks for the existence of a tmux session and creates one if it doesn't exist, then sends the command to the session to launch the stream script.
  • Stop Script: Sends a command to the tmux session to stop the stream. Both scripts are sketched below.
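
Roughly, the pair look like this — a sketch rather than the exact scripts, with the session name ("stream") and the stream script name as assumptions:

#!/bin/bash
# start.sh (sketch) - session and script names are assumptions
SESSION="stream"
# Create the tmux session if it doesn't already exist
tmux has-session -t "$SESSION" 2>/dev/null || tmux new-session -d -s "$SESSION"
# Send the command that launches the stream script into the session
tmux send-keys -t "$SESSION" "sh script.sh" C-m

#!/bin/bash
# stop.sh (sketch) - interrupt the stream script running in the session
SESSION="stream"
tmux send-keys -t "$SESSION" C-c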

I set up Home Assistant to SSH into the host and trigger these scripts.

Step 2: Creating a New Switch and Adding SSH Commands in Home Assistant

To start with, I had to do the normal key-pair exchange from the Home Assistant command line, which creates the key files. For Home Assistant to actually use those keys, you then need to copy them into the /config folder.
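
For anyone repeating this, the dance goes roughly as follows (a sketch; your paths and user@host will differ):

# Generate a key pair, accepting the default location (~/.ssh/id_rsa)
ssh-keygen -t rsa
# Copy the public key to the remote host so password-less login works
ssh-copy-id user@host
# Copy the key and the recorded host key into /config for Home Assistant
cp ~/.ssh/id_rsa /config/id_rsa
cp ~/.ssh/known_hosts /config/known_hosts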

To trigger these scripts, I created a command_line switch in Home Assistant's configuration.yaml file:

yaml
command_line:
  - switch:
      name: Pi_Stream
      command_on: "ssh -q -i /config/id_rsa -o UserKnownHostsFile=/config/known_hosts user@host sh start.sh"
      command_off: "ssh -q -i /config/id_rsa -o UserKnownHostsFile=/config/known_hosts user@host sh stop.sh"
  • The command_on command connects to the remote host and runs the start script, which checks for the tmux session and starts the stream.
  • The command_off command connects to the remote host and runs the stop script, which sends the terminate command to the session, halting the stream.

Step 3: Automating Based on Sun Elevation

The next step was to automate the stream using the sun's elevation. Home Assistant's built-in sun integration tracks the position of the sun, which can be used to trigger automations.

I created an automation that stops the stream once the sun's elevation has been below -8.0 degrees for a minute after sunset:



yaml
- id: '1727953002868'
  alias: Stop Pi Stream
  description: When it's dark
  triggers:
    - trigger: numeric_state
      entity_id:
        - sun.sun
      for:
        hours: 0
        minutes: 1
        seconds: 0
      attribute: elevation
      below: -8
  conditions: []
  actions:
    - action: switch.turn_off
      metadata: {}
      data: {}
      target:
        entity_id: switch.pi_stream
  mode: single

Once this automation is triggered, the stream shuts down as dusk deepens. As the stream is still running at the time of writing, I'll have to wait and see whether it's successful.
Once it's confirmed to be working, I plan to create another automation to start the stream when the sun's elevation rises above -8.0 degrees in the morning.
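
That start automation should just be the mirror image — something like this sketch, which I haven't tested yet:

yaml
- alias: Start Pi Stream
  description: When it gets light
  triggers:
    - trigger: numeric_state
      entity_id:
        - sun.sun
      attribute: elevation
      above: -8
  conditions: []
  actions:
    - action: switch.turn_on
      target:
        entity_id: switch.pi_stream
  mode: single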

My next task is to reconfigure how the chat bot works.
Currently, the bot runs from a script fired hourly by a cron job.
The plan is to take this out of cron and use the SSH function to have Home Assistant fire off the script hourly, but only if the stream is running.
That way, it isn't running all the time.
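
In sketch form, that would be a shell_command in configuration.yaml plus a time-pattern automation gated on the switch — the bot.sh name and twitch_bot alias here are placeholders:

yaml
# configuration.yaml
shell_command:
  twitch_bot: "ssh -q -i /config/id_rsa -o UserKnownHostsFile=/config/known_hosts user@host sh bot.sh"

# automations
- alias: Hourly Twitch Bot
  description: Fire the chat bot only while the stream is live
  triggers:
    - trigger: time_pattern
      minutes: 0
  conditions:
    - condition: state
      entity_id: switch.pi_stream
      state: "on"
  actions:
    - action: shell_command.twitch_bot
  mode: single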

Summary

By integrating Home Assistant with SSH and tmux, I was able to fully automate the control of a video stream based on the sun's position (subject to successful testing in about 4.5 hours, according to the Met Office). This approach can be adapted to any scenario where a remote script needs to be triggered from Home Assistant. It opens the door to many possibilities, whether for controlling cameras, live streams, or other devices based on environmental factors like light and time of day.

If you're looking to combine the power of Home Assistant's automation with remote scripts, SSH and tmux are excellent tools to ensure your commands run reliably.

Wednesday, 2 October 2024

Automating Twitch Announcements Using Cron and Home Assistant


Recently, I integrated Twitch's API to automate sending announcements to my channel's chat. The goal was to set up a system where an hourly log, updated by Home Assistant's Sun integration, triggers a script that sends the last log entry as an announcement in my Twitch channel chat.

Setting Up the Twitch API

To begin, I created a Twitch application, which provided me with a Client ID and Client Secret. These are essential for making authenticated requests to the Twitch API. After creating the app, I needed to gain an Access Token with the appropriate scopes that would allow me to post chat messages.

Initially, I ran into an issue with the requested scope. After consulting Twitch's documentation, I learned that the correct scope for managing announcements had changed from channel:manage:announcements to moderator:manage:announcements. However, after further consideration, I decided to use the user:write:chat and user:bot scopes to simplify the integration.

Getting the Access Token

Using the Twitch OAuth flow, I generated an authorisation URL that included the necessary scopes. Once authorised, I exchanged the authorisation code for an access token using a simple curl command.
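
The exchange itself is one POST to Twitch's token endpoint — something like this, with the IDs and code redacted:

# Exchange the authorisation code for an access token
curl -X POST "https://id.twitch.tv/oauth2/token" \
  -d "client_id=<client_id>" \
  -d "client_secret=<client_secret>" \
  -d "code=<authorisation_code>" \
  -d "grant_type=authorization_code" \
  -d "redirect_uri=<redirect_uri>"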

The access token allowed my script to communicate with Twitch’s API. Additionally, I retrieved my Broadcaster ID and Sender ID—important parameters for sending chat messages.
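
Both IDs can be pulled from the Helix users endpoint; the channel name parameter here is a placeholder:

# Look up a user (and their ID) on the Helix users endpoint
curl -s "https://api.twitch.tv/helix/users?login=<channel_name>" \
  -H "Authorization: Bearer <access_token>" \
  -H "Client-Id: <client_id>"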

Automating Announcements

With the access token and IDs in hand, I wrote a bash script (sketched below) that:

  • Reads the latest log entry from a file (SunElevation.txt), which Home Assistant updates hourly.
  • Sends that entry as an announcement to my Twitch chat using Twitch’s chat API.
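
A minimal sketch of that script, assuming the Helix Send Chat Message endpoint (which the user:write:chat scope covers) and with the token, IDs and path as placeholders:

#!/bin/bash
# Sketch of the hourly announcement script - values are placeholders
ACCESS_TOKEN="<access_token>"
CLIENT_ID="<client_id>"
BROADCASTER_ID="<broadcaster_id>"
SENDER_ID="<sender_id>"

# Read the latest entry Home Assistant wrote to the log
MESSAGE=$(tail -n 1 /path/to/SunElevation.txt)

# Post it to chat via Twitch's Send Chat Message endpoint
curl -s -X POST "https://api.twitch.tv/helix/chat/messages" \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  -H "Client-Id: $CLIENT_ID" \
  -H "Content-Type: application/json" \
  -d "{\"broadcaster_id\":\"$BROADCASTER_ID\",\"sender_id\":\"$SENDER_ID\",\"message\":\"$MESSAGE\"}"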

I configured the script to run hourly via cron, ensuring my channel stays updated with automated messages based on the Sun elevation data collected by Home Assistant.
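
The crontab entry for that is a single line (script name and path assumed):

# Run the announcement script at the top of every hour
0 * * * * /path/to/announce.sh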

Overcoming Common Issues

Throughout the setup, I encountered a few key hurdles:

  1. Authorisation Scope Mismatch: Initially, the scope for sending announcements was incorrect, but switching to user:write:chat and user:bot solved the issue.
  2. OAuth Flow and Redirect URI: I manually managed the OAuth flow, copying the authorisation code from the browser and exchanging it for the access token via the command line. Though the process works, I’ll explore automating this step in the future.
  3. Cron Job Automation: The final piece was setting up cron to run the announcement script hourly. With the SunElevation.txt being updated regularly, this ensures the announcements are always in sync with the current state of the sun.

Conclusion

This setup provides a seamless way to automate Twitch announcements based on data from Home Assistant. The ability to send messages via the Twitch API opens up countless possibilities for engaging viewers in a dynamic, automated way. Whether it’s updating viewers on the weather, system statuses, or other key data points, this method can be easily adapted to suit various needs.

Shout out to my good friend, best mod and overall tech wizard Peaeyeennkay, for helping me navigate the quagmire that is development documentation.

Please go check him out over at https://mastodon.social/@PeaEyeEnnKay

Stay tuned as I continue to refine and enhance this setup!

Monday, 23 September 2024

Streaming Setup: Integrating FFmpeg Overlays and Audio into a Picam feed

Lately, I’ve been setting up and refining a Raspberry Pi-based streaming setup, focusing on combining a video feed from a Raspberry Pi camera with overlay graphics and audio in real-time using ffmpeg. It’s been quite a journey, filled with trial and error as I worked through various technical challenges.

TL;DR Take me to:
The Twitch

I stumbled upon Restreamer (https://github.com/datarhei/restreamer), which runs in a container.
I deployed it to the Raspberry Pi and set about connecting everything up.

Initial Camera and Overlay Setup

I started by streaming a camera feed using rpicam-vid on a Raspberry Pi. The initial command streamed video at 1080p and 30 fps to a TCP connection:

rpicam-vid -t 0 --inline --listen -o tcp://0.0.0.0:8554 --level 4.2 --framerate 30 --width 1920 --height 1080 --denoise cdn_off -b 8000000

I was able to add this to the Restreamer software, add a secondary audio stream, connect it to a Twitch account and stream live.
Unfortunately, the software has no mechanism for adding overlays to the resultant stream.

With this in mind, I created another ffmpeg command that takes the TCP stream from the Pi, overlays an image, and draws the contents of a text file (the "now playing" file described below) on top.

ffmpeg -loglevel debug -i tcp://192.168.1.54:8554 -i StreamOverlay.png \
  -filter_complex "[0:v][1:v]overlay=0:0,drawtext=textfile='current_track.txt':x=(w-text_w)/2:y=h-50:fontcolor=green:fontsize=24:box=1:boxcolor=black@0.5:boxborderw=10" \
  -an -c:v libx264 -f mpegts tcp://<ip_address>:8556

It seems the Raspberry Pi 4 doesn't have sufficient resources to encode the camera feed with the overlay. I tried reducing the incoming camera resolution to 1280x720, but this was still too much for the Restreamer software to handle on the modest hardware. At this point I moved the heavy lifting over to a virtual machine on my home server, which solved the problem.

ffmpeg -loglevel debug -i tcp://192.168.1.54:8554 -i StreamOverlay.png \
  -filter_complex "[0:v][1:v]overlay=0:0,drawtext=textfile='current_track.txt':x=(w-text_w)/2:y=h-50:fontcolor=green:fontsize=24:box=1:boxcolor=black@0.5:boxborderw=10" \
  -an -c:v h264 -b:v 8M -g 30 -preset veryfast -tune zerolatency -bufsize 16M -max_delay 500000 \
  -x264-params keyint=30:min-keyint=15:scenecut=0 -f mpegts "tcp://0.0.0.0:8554?listen"

Initially, I encountered stream quality and decoding errors.
After tweaking buffer sizes, bitrate, and keyframe intervals, things began to stabilise.

Integrating Audio

Next, I focused on integrating audio into the video stream. Initially, I used a separate ffmpeg process to stream MP3 files over TCP, but I faced an issue where audio stopped after the first track ended. The ffmpeg process didn’t crash but would stall on subsequent tracks. Here’s the basic script I used:

#!/bin/bash
audio_folder="<folder where music resides>"
output_file="current_track.txt"
while true; do
  for file in "$audio_folder"/*.mp3; do
    echo "Now playing: $(basename "$file")" > "$output_file"
    cp "$output_file" "/home/rob/$output_file"
    ffmpeg -re -i "$file" -acodec copy -f mulaw "tcp://0.0.0.0:8555?listen"
  done
done

After switching to a local setup, with both the video and audio on the same server, I modified the overlay command to iterate through the MP3s in a folder directly.

Putting it all together


It turned out that the Restreamer software doesn't like being on the Pi, so I bypassed that extra layer entirely. That worked, but I still had issues with audio.

I moved the individual commands into their respective scripts and added some logic to restart the "service" if it dropped for any reason:

#!/bin/bash

# Define the folder containing the audio files
audio_folder="/home/rob/Music"

# Define the text file where the current track info will be written
output_file="current_track.txt"

# Define the playlist file
playlist_file="playlist.txt"

while true; do
    # Generate the playlist file
    rm -f "$playlist_file"
    for file in "$audio_folder"/*.mp3; do
        echo "file '$file'" >> "$playlist_file"
    done

    # Get the first track name to display as "Now playing"
    first_track=$(basename "$(head -n 1 "$playlist_file" | sed "s/file '//g" | sed "s/'//g")")
    echo "Now playing: $first_track" > "$output_file"

    # Run ffmpeg to combine the video, overlay, and audio from the playlist
    echo "Starting ffmpeg overlay with playlist..."
    ffmpeg -loglevel level+debug -i tcp://192.168.1.54:8554 \
            -i StreamOverlay.png \
            -f concat -safe 0 -i "$playlist_file" \
            -filter_complex "[0:v][1:v]overlay=0:0,drawtext=textfile='$output_file':x=(w-text_w)/2:y=h-50:fontcolor=green:fontsize=24:box=1:boxcolor=black@0.5:boxborderw=10" \
            -c:a aac -ac 2 -b:a 128k \
            -c:v h264 -b:v 6000k -g 60 -preset veryfast -tune zerolatency \
            -bufsize 12M -max_delay 500000 -x264-params keyint=60:scenecut=0 \
            -f flv rtmp://live.twitch.tv/app/live_<stream_key>

    # Check if ffmpeg encountered an error and restart
    if [ $? -ne 0 ]; then
        echo "ffmpeg stopped. Restarting in 5 seconds..."
        sleep 5
    fi
done

This seemed to work fine for a time, but then the audio would stop. I have yet to find the time to investigate.

Tidying up


I had the various scripts running in separate tmux sessions for visibility. To make this easier, I made a script that creates the sessions and runs the respective scripts:

#!/bin/bash

# Define script paths
camera_script="/path/to/your/camera_script.sh"
overlay_script="/path/to/your/overlay_script.sh"

# Define session names
overlay_script_session="Overlay"
camera_session="Camera"

# Start tmux session for Camera
tmux new-session -d -s "$camera_session" "bash $camera_script"
echo "Started tmux session: $camera_session"

# Start tmux session for Overlay
tmux new-session -d -s "$overlay_script_session" "bash $overlay_script"
echo "Started tmux session: $overlay_script_session"

This works great if I have to restart everything.
I'm also looking into a way of automating the start and stop of streams based on sunrise and sunset at my location, but for the time being I am just calculating the time in seconds between now and sunrise and adding that to the command in one line:

sleep <seconds> && sh script.sh

Timelapse Creation

During all of this, I also worked on creating a timelapse from the resultant 13-hour video. Using ffmpeg, I generated a 1-minute timelapse that was successfully uploaded to YouTube. The command was straightforward and effective:
ffmpeg -i input_video.mp4 -filter:v "setpts=PTS/802" -an -r 30 output_timelapse.mp4
This command sped up the video by a factor of 802 by adjusting the presentation timestamps, producing a smooth timelapse.
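
For anyone adapting this, the setpts divisor is simply the source duration divided by the target duration: a 13-hour recording is 46,800 seconds, so a one-minute target gives 46,800 / 60 = 780, and my 802 just reflects the exact length of that day's recording.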

Final Thoughts

This project has been a learning experience in stream handling, ffmpeg configurations, and overcoming hardware limitations. I’ve moved most of the intensive processing off the Raspberry Pi to ensure smoother streaming and a better viewer experience.
Man, formatting ffmpeg commands correctly was a battle, especially when taking multiple sources and overlaying them the way I wanted.
While there are always more optimisations to be made, especially regarding audio stability, the progress has been rewarding. 

You can find:
The Twitch

Sunday, 22 October 2023

Add a Twitch randomised fact command


I recently had a conversation with a friend on stream about creating a Twitch chat command that would put a random fact into the chat.
After some poking around with Streamer.bot, I came up with this.
It may not be the most efficient method, but it was something I was able to set up in a few minutes.

You essentially have a number of actions, which are your facts being entered into chat, and another action, triggered by a Twitch !command, that gets a random number between 1 and the number of facts you have and selects the corresponding action.

I thought I'd leave it here in case anyone finds a use for it.
Do you have an idea for Streamer.bot that you want to turn into a reality? Drop a comment and let me know!
If you haven't already, I'd suggest you also check out their Discord; there are some very friendly and clever people over there. https://discord.gg/VmdKdmVya2

Add a randomised Foxy Facts chat command


  1. Go to the Actions tab.
  2. Right-click in the action list and click Add.
  3. Give it a name (you might want to add your facts to a group for ease of viewing later) and click OK.
  4. With that action selected, right-click in Sub-Actions on the right, hover over "Twitch" and "Chat", then click "Send Message to Channel".
  5. Enter the fact and click OK.
  6. Do that for all the facts you have.


  7. Right-click and add another action, the same as before (add a name like "Foxy Facts", set the group, click OK).
  8. Right-click in Sub-Actions, hover over "Core" and "Logic", then click "Get Random Number". Enter from "1" to the number of facts you have, then click OK.
  9. Right-click in Sub-Actions again, hover over "Core" and "Logic", then click "If/Else". Enter "randomNumber" as the variable and 1 as the value, click "do action" (the button will say <No Action Selected>), then scroll and select the first fact action.
  10. We need an If/Else sub-action like that for each fact we have, so repeat the last step with the value set to 2, then 3, and so on, selecting the second fact action, the third, and so on.


Once we've done all that, we need to set a trigger:

  11. Click on the "Commands" tab, right-click on the list and click "Add".
  12. Give it a name and a command like "!foxyfact". If you want to set a cool-down, you can do that on the bottom right. Click OK.
  13. Go back to "Actions" and click on "Foxy Facts".
  14. Right-click in "Triggers", hover over "Core", "Commands" and "Command Triggered", then select the command you just created.

Now, whenever someone puts "!foxyfact" (or whatever you called it) in chat, it will put a random foxy fact in the chat.

Saturday, 13 May 2023

A falling out, A genius idea and an idiot moment.

The modem and the router had an argument in the early hours of Thursday night.

They are no longer on speaking terms, so that's why the network has been down.

Despite my attempts at mediation, the two remain inconsolable.


I thought about purchasing a replacement, but I didn't really want to spend £100-200 on a new Draytek.

That's when I came up with the genius idea of making my own router, with black jack and hookers.


I started looking up OPNsense virtual appliance installation and configuration.

Thinking about the physical aspect, cables and such, my first priority was getting the internet over to the server, on the opposite side of the house.

The current solution was trunking from the master socket up to the office, where the Draytek router resided. From there, a patch lead ran through the loft and down into the opposite bedroom, which is my wife's art studio.

Inside the cupboard, aside from some jackets and art supplies, resides my main server, which provides hypervisor and file services, along with my backup server (which is not entirely live) and a UniFi access point.

The first thing I thought of was a spare MikroTik router.

That's when it hit me. You idiot, just use that.


Cut to three hours later, and I have it configured to give my PC an IP address and to take an IP address from the crappy NowTV modem/router.

I would have rather preserved my IP reservations, but I decided I didn't have the energy to connect back to the Draytek to note down all the reservations manually, so I plugged in the rest of the network.

I added a few NAT port forwarding rules, but at the time of writing that traffic is still being blocked somewhere upstream, so it will require further investigation.


At least now the network has internet access again, so my wife can stream.

I still need to get port forwarding working so that I can start using "private cloud services" and soon my Minecraft mod can get back in the server when they want to.

I'm hoping that, as the LAN range is the same, the existing client leases will be honoured by the new DHCP server and clients will be given the same IPs. That being the case, setting the reservations will be a lot easier.

Edit: They were.

MikroTik's web GUI takes a bit of getting used to, but it is starting to make sense. I suppose that's the same with any router's GUI.


Around the outside, I replaced my PoE switch this week.

I had a really old D-Link web smart switch, and it really wasn't happy. After the third time of waking up to find no network access, I decided to replace it.

I got a tiny 5-port Tenda unmanaged PoE switch. It's smaller, quieter and cheaper to run. Yes, I lost VLANning, but the only reason I wanted that isn't a requirement any more.

It arrived and I realised my mistake: this device was PoE-powered. I hastened to start the returns process, but then remembered that I still had an old PoE injector.

It was only 10/100.

Ok, back to Amazon, cancel the return and grab a gigabit injector.


What I was actually hoping to sort out this weekend, aside from mowing the lawn and doing laundry, was maybe getting cabling down to the living room, or getting a spare Cisco AP installed in the dining room.

This would have been to improve bandwidth for my Raspberry Pi, which was trying to talk to the UniFi on the other side of the house. This resulted in a, let's say, changeable experience when streaming content.


That will have to be for future Rob.

Saturday, 20 August 2022

Website and affiliates

I finally got around to completing version 1.0 of my wife's website for her art and business.

I'm not a web developer or content manager, but we went over the requirements; the main ones were a portfolio gallery and an asset rotator on the front page.

It took a while to get to grips with working on this kind of thing again, going through several iterations based on different platforms before finally landing on WordPress.

If you want, you can check them out. I've added the hyperlinks.


Once this was complete and the "customer" was happy, I decided it was time I updated my own website.

For the longest time, it was just a static page with a blurb and contact information. Honestly, it was looking pretty amateur.

The intention, as with most websites, is to engage with customers. I always had the idea of driving traffic from places like Twitch, YouTube, Twitter etc. to a main site that has all the information in one place and showcases some of my other hobbies and services. I don't really have a particular niche, but I think I come up with, or at least stumble upon, some interesting stuff on occasion.

I've gone with a generic “tech services” template for now and, having removed a lot of the boilerplate stuff, I think it looks OK. There's basically no information on it presently, besides the contact information, but at least it now links out to this blog. In time, my intention will be to customise the theme for more of a tech geek style, with my central green colour scheme running through it.


Back along, my sister created a family WhatsApp group for easily communicating updates during a difficult time. She had posted a link to a shower head on Amazon and it ended up being quite popular within the family; I believe at last count five of us have one. I joked that she should create an affiliate link for it so that she could get a small kickback.

I thought it might be fun to make a pseudo-shop, which showcases things that I've found or purchased that I think are cool or useful enough to show. I created an Amazon affiliate account that I could link out to, so if anyone ever thinks like me and wants to purchase an item, I'll be able to see. This would primarily be for the purpose of learning a bit about something new, I really don't expect to make any money from it.

From there, I also created an affiliate account with my chosen hosting provider, Eco Hosting, after chatting with a colleague in the market for a domain. So I've been toying with CNAMEs and subdomains to try and make this a little more usable and professional-looking.


Lastly on the website front, I had an idea in my head for some time now for a website that has a bunch of how-to guides about things I've learned while streaming on Twitch.

Things like how to do certain things in Open Broadcast Studio, setup cameras and microphones, add transitions and more custom elements like counters using Streamer.bot.

The ultimate goal would be to have written articles outlining the steps, with an accompanying YouTube video.

The site's working title is "Streaming is Hard".

If you're reading this and have any interest, please do leave a comment to say so, and mention if there is anything in particular you wish to see on there.

There's not really a timeline for this as such. Back along, I went through my current OBS configuration and noted everything down, with the intention of one day ripping it all out and doing a stream where I set it all up from scratch.

This way, I can easily split it up into clips to add to the articles. What would be really cool and time-saving would be some sort of step recorder that I could use to create the framework of the written article. I'll have to do some research; if I remember correctly, the built-in Windows Steps Recorder is not quite as clever as all that.


That's about it for this one. I know I say this every couple of years, but I am, once again, going to try and update this blog more frequently and I have a number of new posts in the pipe.

I'm going to try and find a balance between frequency and energy expenditure, so that I don't burn out too quickly.

Keep an eye out for the next one, which will be in a few days to a week. I'm not sure which one I will choose for the next instalment, most of them being of a technical nature, outlining what I've set up or done with a particular service or piece of software, but I think they have the bones of something interesting.

Take care of yourself and keep an eye on your spoons.

Saturday, 19 December 2020

Half an Oven, Windy Fence Panels and plenty of Mince Pies.

I got home from picking my wife up from work and the oven broke.

It tripped the breaker and then the main oven wouldn't heat up above 50 degrees.

We had a look online and found a suitable replacement.


The weather was pretty bad overnight, to the point where Max was a little uneasy, but he eventually settled down.


In the morning, I came downstairs to make coffee and let Max out into the garden, to find that a fence post had broken at the base and two panels had collapsed due to the high winds.


This meant a trip to B&Q to get wood and the tools I hadn't already got for the job. It seems as though where the bushes had become overgrown, the root system had cracked some of the concrete around the post; the post then got wet, rotted away and eventually broke.


I tried getting as much of the wood out as possible, which turned out to be about 20 cm, so I gave that up.

The concrete is still solid and almost half a metre square, so I decided I wasn't getting that out anytime soon and even if I managed it, what the hell am I gonna do with a large lump of concrete?


My wife did her normal and searched online for local businesses that did fencing work.


I looked down the list of companies and, of course, nearly all of them use Gmail or Outlook.com or Yahoo, of all things.

I browsed the websites and they're all the same bog-standard designs with bad fonts and layouts, and even spelling errors, which annoys the hell out of me.

Yes, you may not be able to set up a website yourself, and maybe you can't afford to get someone in to do that for you, but the least you can do is proof your own damn copy.


I've been off work for two weeks now. 

I'm feeling a lot less stressed now that stuff has cleared out of my head; not having to remember certain manual tasks on a daily and weekly basis also makes for much clearer head space.

I sometimes catch myself wondering what's happening with those tasks. Before I eventually turned my work phone off, I could still see people emailing me directly with queries and such, so it's reasonable to assume some of them are being missed.


I still have no idea what's going to happen when I do go back in January. 

I've been meaning to start writing an email to my boss with a list of things that I think need attention and that I feel should be moved to someone else's remit.

I've generally been putting this off to avoid the resultant anxiety, but this has proven to create its own problem: whenever I put my head down to sleep, my brain immediately tries to run through them all at the same time, so I should really start getting that list down for my own benefit.


Still, now is the time for rest and relaxation over this holiday period. Mince pies are a-go-go, some of the various cheeses in the fridge have been opened, and the large box of Milk Tray I received in the post has completely gone.