At the UKOUG Tech17 conference last year, I delivered a presentation called "APEX and the Internet of Things". As part of this presentation, I wanted to demonstrate how to query the database and navigate an APEX application using voice commands. If you were there and visited our stand, you might also have seen our registration application, which allowed users to search for their name and register themselves to win a prize.
We have been looking into real-world use cases for data retrieved by voice command, and we have found that there is room for useful applications wherever hands-on work means you don't always have a device available to view your application. For example, a server room could contain an Amazon Echo that, when asked, responds with information about the servers and the environments hosted on them.
This blog post will show you how to get up and running by creating an Alexa skill that queries your Oracle database.
There are three main building blocks needed to get this working in its simplest form:
- ORDS Web Service (configured either in APEX or on the database).
- AWS Lambda function (a serverless function that sits between the Alexa service and the web service).
- Alexa Skill.
These building blocks fit together something like this:
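To make the flow concrete, here is a rough Python sketch of the Lambda piece: Alexa invokes the handler with an intent request, the handler fetches a value (ultimately from the ORDS web service), and it replies with the JSON envelope the Alexa service expects. The intent name `OrderCountIntent` and the speech text are illustrative, not taken from the example application, and the ORDS call is stubbed out here.

```python
def build_speech_response(text):
    """Wrap plain text in the JSON envelope the Alexa service expects."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": True,
        },
    }

def fetch_order_count():
    """Placeholder: in the real skill this would call the ORDS web service."""
    return 42

def lambda_handler(event, context):
    """Entry point Alexa invokes; dispatch on the request type and intent name."""
    request = event.get("request", {})
    if (request.get("type") == "IntentRequest"
            and request["intent"]["name"] == "OrderCountIntent"):
        count = fetch_order_count()
        return build_speech_response(f"There are {count} orders.")
    return build_speech_response("Sorry, I didn't understand that.")
```

The skill definition in the Alexa developer console maps spoken utterances to the intent name, so the handler only ever deals with structured JSON, never raw speech.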
I’ve compiled some step-by-step instructions to help you create the three different elements and fit them together. You can find the instructions in the following application:
The example includes a simple web service I created to count the number of orders in a table, but you can replace this with anything you like. The query in your web service can be as complicated as you need – it is here that you can apply your existing Oracle knowledge to create Alexa skills that are meaningful to you, based on your own data.
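On the Lambda side, consuming the web service amounts to an HTTP GET and a little JSON parsing. The sketch below assumes a hypothetical ORDS URL and a response in the collection-feed shape that ORDS SQL handlers typically produce (an `items` array), with the query aliasing the value as `order_count`; the exact URL and JSON shape depend on how you define your own handler.

```python
import json
import urllib.request

# Hypothetical ORDS endpoint; substitute your own host, schema alias and module path.
ORDS_URL = "https://example.com/ords/demo/orders/count"

def parse_order_count(body):
    """Pull the count out of an ORDS collection-feed JSON payload.
    Assumes the handler's query aliases the value as order_count."""
    payload = json.loads(body)
    return int(payload["items"][0]["order_count"])

def fetch_order_count(url=ORDS_URL):
    """GET the web service and return the order count as an int."""
    with urllib.request.urlopen(url) as resp:
        return parse_order_count(resp.read().decode("utf-8"))
```

Keeping the parsing in its own function makes it easy to adapt when you swap the example query for your own, since only the field names change.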
Keep checking the blog for part two, where I will demonstrate how to interact with an APEX application UI using Alexa voice commands.