I had to do some SMTP relay troubleshooting and it wasn’t obvious how to view the raw SMTP message in Outlook. So here’s how:
In the message window, choose File/Properties and look at the “Internet headers” at the bottom of the Properties dialog.
I am using Office Professional Plus 2016. If you have a different version of Outlook such as Outlook 2010 or Outlook 2013, the steps may be different for your version.
I spent a good part of the weekend on a re-design of a client’s dental practice site. Like probably 95% of dentist/doctor practices’ sites out there, this site runs on WordPress. Here’s the updated design:
Here’s the mobile version:
I didn’t have time to find/get better photos for the mobile dentistry service at the bottom of the page, so those photos will have to do for now.
So, here are some tips for how to quickly make changes to an existing WordPress site:
Learn the Theme
Familiarize yourself with the WordPress theme in use. Every theme works a little differently. The theme for this site exposes a lot of the options via its Theme Options menu, but not all theme customizations can be done there. Some of the options must be done in WordPress’s built-in Appearance/Customize menu.
Most commercial themes also come with fairly extensive documentation. Give the docs a read-over at least once.
Learn the Plug-ins
Many sites and themes make heavy use of plug-ins. For example, this dentist office’s site uses Slider Evolution to show a slider on the home page. Slider Evolution is itself a fairly deep application with lots of options.
Use the String Locator Plug-in
The String Locator plug-in is great for when you see some element on a page that you need to change but have no idea how it gets there. This plug-in lets you easily search through your themes, plug-ins, or even WordPress core to find the files containing that text.
Track Your WordPress Site in Git
Adding the site’s files to a git repo allows you to track changes you make to the theme’s files (style.css, header.php, etc.) and easily revert or identify exactly what you changed.
You should configure your .gitignore to ignore at least the following folders/files:

```
uploads
/wp-content/plugins/
/wp-content/mu-plugins/
cache/
backups/
```
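Assuming you have shell access to the host, the initial setup can be sketched like this (the install path and commit message are placeholders; `snapshot_site` is a helper made up for illustration):

```shell
# Sketch: snapshot a WordPress install into git before making theme changes.
# The path argument is hypothetical - point it at your own install root.
snapshot_site() {
  cd "$1" || return 1
  git init -q
  # Ignore uploads, plug-ins, caches, and backups per the list above
  printf '%s\n' 'wp-content/uploads/' 'wp-content/plugins/' \
    'wp-content/mu-plugins/' 'cache/' 'backups/' > .gitignore
  git add -A
  git commit -qm "Snapshot before theme changes"
}

# Usage: snapshot_site /var/www/mysite
```

After this, `git diff` and `git log` show exactly what you touched in the theme, and `git checkout -- <file>` reverts a bad edit.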
Make Frequent Site & MySQL Backups
Set up automated backups of the site’s files and database so that, if disaster strikes, you have something to go back to.
Here’s the script I use to back up my files and database:
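The script itself isn’t reproduced here, but the general shape is a tar of the site files plus a mysqldump of the database. A minimal sketch, where every path, name, and the retention count are placeholders:

```shell
#!/bin/sh
# Sketch of a nightly backup script - all paths and names are placeholders;
# point the variables at your own site and database.
SITE_DIR="${SITE_DIR:-/var/www/mysite}"
BACKUP_DIR="${BACKUP_DIR:-/home/me/backups}"
DB_NAME="${DB_NAME:-wordpress}"
STAMP=$(date +%Y%m%d-%H%M%S)

# Archive the whole WordPress directory tree
backup_files() {
  tar -czf "$BACKUP_DIR/site-$STAMP.tar.gz" \
    -C "$(dirname "$SITE_DIR")" "$(basename "$SITE_DIR")"
}

# Dump the database; credentials come from ~/.my.cnf so they
# never appear on the command line
backup_db() {
  mysqldump "$DB_NAME" | gzip > "$BACKUP_DIR/db-$STAMP.sql.gz"
}

# Keep only the 14 most recent backups of each kind
prune_old() {
  ls -1t "$BACKUP_DIR"/site-*.tar.gz 2>/dev/null | tail -n +15 | xargs -r rm -f
  ls -1t "$BACKUP_DIR"/db-*.sql.gz 2>/dev/null | tail -n +15 | xargs -r rm -f
}

# Typical invocation from cron:
# backup_files && backup_db && prune_old
```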
Override CSS with Appearance/Customize/Additional CSS
With WordPress themes, the “Additional CSS” feature can be used to override the theme’s CSS to your liking. I find that the best way to use it is in conjunction with Chrome DevTools: find a relevant class for the element you need to tweak, then override its CSS. You often need to add “!important” for your override to take effect. A really cool thing about the Additional CSS feature is that any CSS change takes effect right away in the preview pane.
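As an example, an override dropped into Additional CSS looks like this (the selector is made up for illustration; use whatever your theme actually generates, as shown in DevTools):

```css
/* Hypothetical selector found via Chrome DevTools */
.site-header .site-title a {
  color: #2a6b9c !important; /* !important wins over the theme's own rule */
}
```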
A while back I wrote about sending data from SmartThings and other home devices to Splunk so I can monitor what goes on in my home via Splunk dashboards. In addition to SmartThings devices, I also pulled data from other sources such as network routers, Windows event logs, weather data retrieval scripts, etc.
To monitor our Internet bandwidth usage, I wrote a Node.js program to scrape the data from the admin web UI of my Verizon Actiontec MI424WR router. Here’s the code for that.
Last week I upgraded my internet to Verizon Fios Gigabit and with that upgrade, the Actiontec router was replaced with another router: a Netgear R7000 running Advanced Tomato (open-source Linux-based firmware for Broadcom-based Wi-Fi routers). Advanced Tomato has a pretty slick interface to monitor bandwidth, but I still want the data in my Splunk instance.
Luckily, Advanced Tomato runs a variant of Linux, so all I needed was a shell script to calculate bandwidth usage data and send it to Splunk via the Splunk HTTP Event Collector.
I found a script by WaLLy3K that already had the bandwidth calculation logic and all I had to add was a little more code to send the data to Splunk.
Step-by-step Instructions
Enable JFFS Partition on Your Router
Enable the JFFS partition on your router so that you have permanent storage for your script; otherwise, if you saved your script in /tmp, it would be gone after the next reboot. Log into your router’s admin UI, choose Administration/JFFS, select Enabled, and click Save.
Create Your Script
SSH into your router and create a shell script at /jffs/bandwidth.sh with the content from here. Update the splunkUrl variable with your Splunk HEC URL. If you are not able to SSH in, make sure the SSH Daemon is enabled under Administration/Admin Access.
For more info on setting up the Splunk HTTP Event Collector, see my previous post.
```shell
# This is just an excerpt of the code. For full code see
# https://github.com/chinhdo/shell-scripts/blob/master/sh/bandwidth.sh
...
wan_iface=$(nvram get wan_iface)

# Calculate floating-point arithmetic using AWK instead of BC
calc() { awk "BEGIN { print $* }"; }

checkWAN() {
  # Default to a 1-second sampling interval
  [ -z "$1" ] && sec="1" || sec="$1"

  # First snapshot of the WAN interface's cumulative RX/TX byte counters
  netdev=$(grep "$wan_iface" /proc/net/dev)
  pRX=$(echo $netdev | cut -d' ' -f2)
  pTX=$(echo $netdev | cut -d' ' -f10)

  sleep $sec

  # Second snapshot after the sampling interval
  netdev=$(grep "$wan_iface" /proc/net/dev)
  cRX=$(echo $netdev | cut -d' ' -f2)
  cTX=$(echo $netdev | cut -d' ' -f10)

  # If a counter wrapped past 0xFFFFFFFF, compensate; otherwise take the difference
  [ $cRX \< $pRX ] && getRX=$(calc "$cRX + (0xFFFFFFFF - $pRX)") || getRX=$(calc "($cRX - $pRX)")
  [ $cTX \< $pTX ] && getTX=$(calc "$cTX + (0xFFFFFFFF - $pTX)") || getTX=$(calc "($cTX - $pTX)")

  # Bytes per second, and a rough busy/idle status
  dlBytes=$(($getRX/$sec)); ulBytes=$(($getTX/$sec))
  [ $dlBytes -le "12000" -a $ulBytes -le "4000" ] && wanStatus="idle" || wanStatus="busy"

  # Convert bytes/s to kbit/s and Mbit/s
  getDLKbit=$(printf "%.0f\n" $(calc $dlBytes*0.008)); getULKbit=$(printf "%.0f\n" $(calc $ulBytes*0.008))
  getDLMbit=$(printf "%.2f\n" $(calc $dlBytes*0.000008)); getULMbit=$(printf "%.2f\n" $(calc $ulBytes*0.000008))
}
```
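The excerpt stops before the Splunk call, but shipping the numbers to the HTTP Event Collector is essentially one authenticated HTTP POST. A sketch, where the URL and token are placeholders and send_to_splunk is a helper made up for illustration:

```shell
# Placeholders - substitute your own Splunk host and HEC token
splunkUrl="https://splunk.example.com:8088/services/collector"
splunkToken="00000000-0000-0000-0000-000000000000"

# Post one JSON event with download/upload Mbit/s and the busy/idle status
send_to_splunk() {
  payload='{"event": {"dl_mbit": '"$1"', "ul_mbit": '"$2"', "status": "'"$3"'"}}'
  # -k skips certificate checks, which a self-signed Splunk cert needs
  curl -sk "$splunkUrl" -H "Authorization: Splunk $splunkToken" -d "$payload"
}

# Usage (with values computed by checkWAN):
# send_to_splunk "$getDLMbit" "$getULMbit" "$wanStatus"
```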
Create another shell script /jffs/bandwidth-env.sh with the following content:
To test your script, run it manually and confirm the data shows up in Splunk:
/jffs/bandwidth-env.sh
Schedule Your Script
To schedule your script, you can use the Scheduler (Administration/Schedule) in the router’s web admin UI. I have an automatic reboot scheduled at 4 AM, so I scheduled a custom script at 4:15 AM to run the bandwidth-env.sh script:
To start the script right away, spawn a process for it:
/jffs/bandwidth-env.sh &
Additional Info
Here’s a little bit of info on how the script works. The raw bandwidth data is read from /proc/net/dev.
Per redhat.com, /proc/net/dev “lists the various network devices configured on the system, complete with transmit and receive statistics. This file displays the number of bytes each interface has sent and received, the number of packets inbound and outbound, the number of errors seen, the number of packets dropped, and more.”
For our purpose, we are interested in the first statistics column, which contains the cumulative number of bytes received by the interface, and the 10th, which contains the cumulative number of bytes sent. (In the script these are cut fields 2 and 10, because field 1 is the interface name.)
The script reads the current counters, sleeps for a number of seconds, then reads the updated counters. The download/upload Mbit/s figures are calculated by taking the difference and dividing by the elapsed time. There’s also some logic to handle the case where a counter wraps around its maximum value back to zero.
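As a sanity check, you can reproduce the calculation by hand with a few lines of shell (the interface name is whatever your system uses; rate_mbit and rx_bytes are throwaway helpers for illustration):

```shell
# Read the cumulative RX byte counter for one interface from /proc/net/dev.
# awk field 1 is "iface:"; field 2 is the RX byte counter.
rx_bytes() { awk -v i="$1" '$1 == i ":" { print $2 }' /proc/net/dev; }

# Sample the counter twice, one second apart, and print Mbit/s downloaded:
# bytes/s * 8 bits per byte / 1,000,000 = Mbit/s
rate_mbit() {
  p=$(rx_bytes "$1")
  sleep 1
  c=$(rx_bytes "$1")
  awk -v p="$p" -v c="$c" 'BEGIN { printf "%.2f\n", (c - p) * 8 / 1000000 }'
}

# Usage: rate_mbit vlan2   (the WAN interface name depends on your router)
```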
Here’s how the data shows up in my Splunk Home dashboard:
Imagine: merging a Pull Request is all it takes to automatically deploy your static or single-page app to a secure, dynamically scaled, and globally distributed network with integrated API support – that’s the promise of Azure Static Web Apps.
Announced during Microsoft Build 2020, Azure Static Web Apps is a service that automatically builds and deploys static web apps to Azure from a GitHub repository. The features that I find the most interesting are:
First-party GitHub integration
Globally distributed
Free, auto-renewed SSL certificates
Integrated API support by Azure Functions
For now – Free hosting for your static site (Angular, React, etc)
I gave it a try and I have to say: it’s pretty cool! I’ll have the step-by-step on how to configure and deploy an existing site to Azure Static Web Apps below. The steps seem lengthy but there are really just a few basic steps:
Create a new “Static Web App” resource and configure basic parameters.
Point to the GitHub repository of your app.
After a few screens, your app is automatically deployed and available on its own secure URL on Azure Static Web Apps.
If you want to create a brand new site from scratch, see the official documentation at docs.microsoft.com.
Deploying an Existing Static Site to Azure Static Web Apps
As part of the #100DaysOfCode challenge, I’ve been working on a React site that hosts various programming utilities like encoders/decoders, UUID generator, test data generators, etc. It’s a perfect candidate to test out Azure Static Web Apps.
Create Your Static Web App
First, log into Azure Portal and click “Create a resource“, then search for “Azure static web“. You should see Static Web App (Preview) in the search results. Click on it.
Click Create.
Fill out the Basics tab. Most of the fields are self-explanatory. Click “Sign in with GitHub“.
The page expands, showing a few more fields for GitHub. Fill them out with the info for your app’s GitHub repository, and choose “Next: Build >”.
On the Build tab, fill in the appropriate values for “App location“, “Api location“, and “App artifact location“. Then click “Review + create“.
“App location” is the root folder of your app – typically / or /app. “Api location” should be left blank if you are unsure. “App artifact location” is the folder containing your build output.
Review your settings and click Create. Wait a few seconds for the deployment to complete. During the initial deployment, Static Web Apps automatically creates a GitHub Action for you (in a file named azure-static-web-apps-<id>.yml) and adds it to your chosen branch. When it’s all done, you should see this page:
Click “Go to resource” to go to the resource page for your new Static Web App. On the Overview page, you will see a link to your web app.
Click on “GitHub Action runs” to go over to your GitHub repo and view the status of your deploy Action. You should see a new Action named “ci: add Azure Static Web Apps workflow file”. It should take about two minutes to run.
Switch to the Code tab and see that a new Action file was added to your repo. This is what tells GitHub to automatically build and deploy your app to Azure Static Web Apps.
Bring Up Your Static Web App
It’s time to bring up our web site on Azure Static Web Apps!
Go back to Azure Portal and click on the URL to your app to bring it up. The below screenshot shows my app now running on Azure Static Web Apps with its own unique secure URL.
Custom Domain Name
Adding a custom domain name is pretty straightforward. On Azure Portal, go to the home page of your Static Web App, and click on “Custom domains“, then Add.
On the “Add custom domain” page, note the Azure host name for your site. You will need to create a DNS CNAME record to point your custom domain to that Azure host name. The specifics of this part depend on your domain registrar or web hosting provider. For me, the domain name chinhdo.com is hosted on Pair Networks, so I went to their control panel and created the CNAME record there.
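In zone-file terms, the record looks something like this (both names below are placeholders; use your own subdomain and the exact host name Azure shows you):

```
; Hypothetical CNAME record pointing www at the Azure-generated host name
www.example.com.    IN    CNAME    happy-river-0123abcd.azurestaticapps.net.
```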
After you have created your DNS CNAME record, go back to Azure Portal and paste your custom domain into the “Custom domain” box, then click Validate.
Depending on your DNS settings it can take up to 48 hours for a new DNS record to propagate. In practice it should not take more than an hour. In my case it took about 15 minutes. If you get an error in the next step, just wait some time and try again.
After you get the “Validation succeeded” message, click Add to add your custom domain to your Static Web App. This step took about one minute for me.
To handle server routes, you need to add a routes.json file to your build folder. In my React app, I added it to the /public folder.
Server routing is required to handle “hard” navigation to routes that are handled by your single-page app. In my case, I have a React route for /uuid which works fine when you navigate there within the app. But you will get a 404 if you go there directly, or do a browser refresh while on that page. Server routes take care of that. See the official docs for more info.
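For the SPA-fallback case, a minimal routes.json can be a single catch-all rule that serves index.html for every path (treat this as a sketch and check the official docs for the current schema):

```json
{
  "routes": [
    { "route": "/*", "serve": "/index.html", "statusCode": 200 }
  ]
}
```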
I did run into one problem with a test site. The initial deployment for it did not happen automatically. I verified that the *.yml file was created but the Action never got executed. I was able to get around that by pushing a “dummy” change to the repository.
Game Changer?
Azure Static Web Apps looks to be a game changer. You are getting important features like SSL, dynamic scaling, global distribution, and GitHub deployment all in one easy-to-use package. Once configured, deploying your changes is as simple as pushing code to your GitHub repository. In fact I have deployed several new versions of the app over the past few days and it worked perfectly fine each time.
If the cost is reasonable, I would definitely use it on a permanent basis for my single-page/static sites. Currently Azure Static Web Apps is in preview and free, but things may change after it goes out of preview.
What do you think about Azure Static Web Apps? I would love to hear your thoughts. Let me know here in the Comments section or on Twitter!
The smart home has gone through quite a convergence in the last few years. Modern protocols like Z-Wave & ZigBee, along with smart hubs and smart assistants like Amazon Alexa, Google Home & Apple Siri, are finally bringing everything together to make the smart home a practical and reliable reality.
What had still been missing from the picture for me is the ability to log, analyze, and visualize all the data that my smart home generates. I use Splunk (a data capture and visualization tool) at work, so I decided to give it a try at home, and it’s worked out great.
Here’s a Splunk dashboard I created for my home, showing current and historical data from multiple data sources: energy meter, contact sensors, switches, weather data feed, Windows event logs, and some custom PowerShell scripts.
My SmartThings-based smart home setup:
Samsung SmartThings Hub 2nd Gen
Amazon Echo Devices
Various ZigBee/Z-Wave devices
Samsung SmartThings GP-U999SJVLAAA Door & Window Multipurpose Sensors
PowerShell scripts to pull data from openweathermap.org & run/log periodic Internet speed tests.
Splunk Free
Installing Splunk Free Edition
Download and install Splunk. You will start with the Enterprise version, which comes with a 60-day trial; after that you can switch to the Free edition. Splunk Free allows indexing up to 500 MB of data per day, which has been sufficient for my home logging needs. For my setup I installed Splunk on a 14-year-old Windows box with an Intel Core2 Quad CPU Q6600 @ 2.40GHz – Splunk indexing/query performance has been pretty acceptable.
If your install was successful, you should be able to log into Splunk web by navigating to http://localhost:8000 (or replace localhost with your Splunk server hostname).
If you want to monitor other computers, install Splunk Universal Forwarder on each of those computers. I’ll go through how to configure the Universal Forwarders in a future post.