Posts

Showing posts from March, 2020

Configuring DNS - Web Servers with Subdomains

So I first purchased the domains phub.info and slabj.com for testing purposes and changed their DNS settings to direct them to servers I created on DigitalOcean. I am connecting phub.info and slabj.com to LAMP stacks running WordPress. The setup has three main parts. First, I configured GoDaddy, where I purchased the domains, to use custom nameservers: ns1.digitalocean.com, ns2.digitalocean.com, and ns3.digitalocean.com. Then I set the A record with hostname @ to point to the server I want to use, and the CNAME record for www to match each site's hostname, so that people can visit my website by typing the site name with or without www in front. The sites pointed over fairly quickly, but this process can take 24-48 hours to take effect. I am going to add a subdomain to slabj.com so that if you go to promotions.slabj.com you can see special promotional offers for the month. To add a subdomain within my Apache server configuration, I go to /var/www, where my…
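
As a sketch of the subdomain step, a minimal Apache virtual host for promotions.slabj.com might look like the following (the file path and DocumentRoot here are assumptions, not the exact configuration from this setup):

    # /etc/apache2/sites-available/promotions.slabj.com.conf (hypothetical path)
    <VirtualHost *:80>
        ServerName promotions.slabj.com
        DocumentRoot /var/www/promotions
        ErrorLog ${APACHE_LOG_DIR}/promotions_error.log
        CustomLog ${APACHE_LOG_DIR}/promotions_access.log combined
    </VirtualHost>

Enabling the site with a2ensite, reloading Apache, and adding an A record for the promotions hostname at the DNS level would complete the subdomain.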

Create a LAMP (Linux, Apache, MySQL & PHP) stack with WordPress

In this example I create a LAMP stack and add WordPress, along with Node.js for additional customization later on. The LAMP stack consists of Linux, Apache, MySQL, and PHP. Once that was set up I loaded WordPress, which uses PHP to run scripts and interacts with the MySQL database to store and retrieve data, while Apache serves the resulting pages. That is why this stack is so powerful: its four major elements are each well suited to their own specific tasks. I am creating this in the cloud on a DigitalOcean droplet. From my fresh server I started by updating Ubuntu, connecting to the command line interface through SSH from my work machine. From the command line I installed Apache and configured the firewall to initially accept HTTP and HTTPS traffic. Later I will add an SSL certificate and make sure all HTTP requests are upgraded to HTTPS for security. Apache2 on Ubuntu is now up and running. Here I have now confirmed that…
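
A sketch of those first commands, assuming Ubuntu's standard apt and ufw tooling (not necessarily the exact commands used here):

    sudo apt update && sudo apt upgrade -y   # update Ubuntu
    sudo apt install apache2 -y              # install the Apache web server
    sudo ufw allow "Apache Full"             # open ports 80 (HTTP) and 443 (HTTPS)
    sudo ufw enable                          # turn the firewall on
    systemctl status apache2                 # confirm Apache is up and running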

Using GraphQL Clients

This builds upon the previous setup and creation of an Express GraphQL cloud server. The server has an HTTP API endpoint mounted using GraphQL and Express, the Ubuntu virtual machine is in the cloud, and there is an initial data set to test with. I send my initial request via curl as follows:

    curl -X POST \
      -H "Content-Type: application/json" \
      -d '{"query": "{ hello }"}' \
      http://138.68.62.109:4000/graphql/

I now get the following response:

    {"data":{"hello":"Welcome to your new Express GraphQL Server Jason!"}}

There are a few different ways in which the data can be accessed. Above I used curl; in the previous article I showed how the GraphiQL interface can be used; and here I am showing how the endpoint can even be reached from a regular browser. I just navigate to the HTTP API endpoint I have created, enter the query, and the same data is returned.
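
As a sketch of the browser approach, express-graphql also accepts GET requests, so the query can presumably be passed in the URL's query string (shown against the same droplet as above; the exact form used in the post is not preserved in this excerpt):

    http://138.68.62.109:4000/graphql?query={hello}

With graphiql enabled on the endpoint, visiting the bare /graphql URL in a browser instead opens the visual GraphiQL editor.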

APIs (How to run an Express GraphQL Server)

This example builds on the initial GraphQL setup and initialization, so I will begin from the end of that article. I already have an Ubuntu virtual machine in the cloud with Node.js and GraphQL, and the initial cloud server setup is done at this point. Now I am going to continue configuring and modifying a few things to turn this virtual machine into an Express GraphQL server. I installed Express from the command line. Next I modified the original server.js file I had created. It now requires a new module, 'express', which lets me run a web server and mount a GraphQL API server on the HTTP endpoint I have assigned, "/graphql", on my Ubuntu cloud virtual machine. And now things are beginning to get more interesting: I have a visual interface with which I can send queries to the API HTTP endpoint. So when I send a query for '{ hello }', the contents of the variable within my function are…
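
A minimal sketch of what that modified server.js plausibly looks like, following the standard express-graphql pattern of the time (the greeting string is taken from the response shown in the companion post; the rest is the library's documented boilerplate, not necessarily the exact file):

    // server.js - Express GraphQL server sketch
    var express = require('express');
    var graphqlHTTP = require('express-graphql');
    var { buildSchema } = require('graphql');

    // Define a schema with a single query field returning a String
    var schema = buildSchema(`
      type Query {
        hello: String
      }
    `);

    // Resolver that supplies the value for hello
    var root = {
      hello: () => 'Welcome to your new Express GraphQL Server Jason!'
    };

    var app = express();
    app.use('/graphql', graphqlHTTP({
      schema: schema,
      rootValue: root,
      graphiql: true,   // enables the visual GraphiQL interface in the browser
    }));
    app.listen(4000, () => console.log('GraphQL API at http://localhost:4000/graphql'));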

Setup & Initialization for GraphQL using Node.js

This is a primer for getting started with GraphQL using Node.js in the cloud. I first created an Ubuntu cloud server and installed Node.js and GraphQL; then I was ready to load up a server file and fire up a node. For this introductory example I am going with a classic "Hello World!" program, which just shows how to set up and initialize a basic API request. GraphQL does get more interesting, and I will show more complex queries in later posts; the goal here is just to show how to get up and running with querying APIs using GraphQL. First I created a new directory using the CLI (command line interface), which I connect to via SSH from the terminal on my home work machine. Once I went through all the server setup and installations, it was time to test the system. For this I created a server.js file and added the contents seen below. Here a schema is defined with a query that will return a string with the value stored in '{ hel…
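
The canonical "Hello World" for GraphQL in Node.js looks roughly like this, following the graphql-js quick start of the time (it may differ in detail from the exact file in the post):

    // server.js - minimal GraphQL "Hello World" sketch
    var { graphql, buildSchema } = require('graphql');

    // Define a schema with one query field returning a String
    var schema = buildSchema(`
      type Query {
        hello: String
      }
    `);

    // The resolver supplies the value for hello
    var root = { hello: () => 'Hello World!' };

    // Run the '{ hello }' query against the schema
    graphql(schema, '{ hello }', root).then((response) => {
      console.log(response);  // { data: { hello: 'Hello World!' } }
    });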

Simple XMLHttpRequest (AJAX)

Here I am illustrating how to make a simple XMLHttpRequest to do an asynchronous call to a file. I first created an index.html page with a constructor, "xhr = new XMLHttpRequest();". Then, in a separate plain text file, I put the text "AJAX - XMLHttpRequest!". Next I defined an if/else statement for what to do in case of an error (404) or success (200) status. Below, I am saying that if there is a 200 success status and my document is found, then create a pop-up alert with the contents of the file. This could be used to pop up a verification code or something else that I want to load asynchronously relative to the rest of the page. The result is an alert box that displays the contents of the dom.txt file. Next I create an error message to be displayed in the console if the file is not found. Finally I finish defining the xhr options: here I am requesting a file called dom.txt and I want it to l…
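
A minimal sketch of the request, reconstructed from the description above (handler structure and ordering are assumptions):

    // index.html script - asynchronous GET of dom.txt
    var xhr = new XMLHttpRequest();

    xhr.onload = function () {
      if (xhr.status === 200) {
        alert(xhr.responseText);            // pop up the file's contents
      } else if (xhr.status === 404) {
        console.error('dom.txt not found'); // log the error to the console
      }
    };

    xhr.open('GET', 'dom.txt', true);       // true = asynchronous
    xhr.send();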

Creating Website Cookies For Return Customers

Cookies are small pieces of data that a website sends to a user's web browser. They can be used to store stateful information or to record the user's browsing activity. Sites usually use this either for security, making a user's account information visible only while they are logged in (I can make the cookie, and the access it grants, expire upon logging out), or to use the browsing-activity records for targeted advertisements or targeted discounts. For this example I am creating a simple cookie called "site_cookie_1" to track returning customers. The cookie also has a built-in expiry of one day, after which it will in essence self-destruct. Next I wrote PHP to report to the screen whether the user has the cookie or not: once the cookie is set, there is a visual confirmation with the cookie name echoed back to the user. This can also be sent to a text file for logging purposes. So now when I first visit the cookies.php…
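
A minimal sketch of the cookies.php logic, reconstructed from the description above (the cookie value and messages are assumptions):

    <?php
    // cookies.php: set a one-day cookie for returning customers
    $cookie_name = "site_cookie_1";

    if (!isset($_COOKIE[$cookie_name])) {
        // Not set yet: create it, expiring in one day (86400 seconds)
        setcookie($cookie_name, "returning_customer", time() + 86400, "/");
        echo "Cookie '" . $cookie_name . "' has been set. Welcome!";
        // Note: $_COOKIE only reflects the new cookie on the *next* request
    } else {
        echo "Welcome back! Cookie '" . $cookie_name . "' is already set.";
    }
    ?>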

Automated login with PHP and cURL

Using PHP and cURL to log in to an account automatically can be quite useful. Sometimes I may need to access a user's information from another website to populate a form, facilitate a user sign-up, or initiate some other feature. In this case, when the user provides credentials I can use cURL with PHP to perform an automated login and retrieve the user's information. The simple login form I created to test this script has a username field and a password field at login1.php. Upon submitting the form, the process.php file handles querying the database and confirms whether or not the user is authenticated. Since here I am automating the process, I can bypass the manual login1.php form and just send the $data array straight to the handler file, process.php. In the first screenshot below you can see the first page a visitor arrives at: they are asked for a username and password. In this process, since I am skip…
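
A minimal sketch of the automated login, posting the $data array straight to process.php (the URL and credentials are placeholders; the cookie jar is an assumption, added because logins typically establish a session cookie):

    <?php
    // hypothetical automated login via cURL
    $data = array('username' => 'demo_user', 'password' => 'demo_pass');

    $ch = curl_init('http://example.com/process.php');
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($data));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);       // return the response as a string
    curl_setopt($ch, CURLOPT_COOKIEJAR, 'cookies.txt');   // keep the session cookie

    $response = curl_exec($ch);
    curl_close($ch);
    echo $response;   // the authenticated page / user info
    ?>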

Configuring an Nginx Server as a Reverse Proxy for an Apache Server

For this example I created and configured an Ubuntu virtual machine in the cloud, installed the Nginx and Apache servers on it, and finally configured Nginx as a reverse proxy for Apache. The motivation is that different servers are more efficient at delivering specific types of content: Nginx is better at serving static content, while Apache is better suited to the dynamic backend work, such as pages driven by an SQL database. The process has quite a few steps. Once I had my Ubuntu server in the cloud, I updated all the software and installed Nginx. A quick "curl -I localhost" shows that my localhost is now served by Nginx, so at that point I was able to proceed with the rest of the configuration. With Nginx up and running on localhost, I next installed an Apache server on the virtual machine to serve the backend content from a database. Since Nginx was running on port 80 and Apache wanted to start on port 80 as well, I we…
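
A typical reverse-proxy server block for this arrangement looks roughly like the following (this assumes Apache was moved to port 8080, the usual resolution of the port conflict; the server_name is a placeholder):

    # /etc/nginx/sites-available/default - reverse proxy sketch
    server {
        listen 80;
        server_name example.com;

        location / {
            proxy_pass http://127.0.0.1:8080;    # hand every request to Apache
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
        }
    }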

Python - Network Programming - Cloud-based reverse shell

Creating a reverse shell to do work or make modifications on a remote machine can at times be essential. Many years ago, working in sales and as an analyst, I remember not understanding networking and systems; the process seemed magical as someone remotely opened files, updated printers, or troubleshot machines from what seemed to be a far-off distant land (the server room). I have learned a lot since those days, and here I want to show in a simple way how a reverse shell is created. It is just a connection from one computer to another; in essence I can do anything with the remote machine as if I had plugged a monitor and keyboard into it. There are two main parts to creating this type of connection: a server and a client, each using a separate Python file. I find that the easiest way to understand large or complex systems is to understand the individual components that make up the gre…
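
A minimal sketch of the two components, assuming a plain TCP socket design (the port, buffer size, and exit convention are illustrative choices, not necessarily those from the post):

    # server.py - run on the controlling machine; waits for the client to call back
    import socket

    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.bind(('0.0.0.0', 5555))       # listen on all interfaces, port 5555
    s.listen(1)
    conn, addr = s.accept()
    print('Connection from', addr)

    while True:
        cmd = input('shell> ')      # command to run on the remote machine
        conn.send(cmd.encode())
        if cmd == 'exit':
            break
        print(conn.recv(4096).decode())
    conn.close()

    # client.py - run on the remote machine; connects back and executes commands
    import socket
    import subprocess

    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect(('SERVER_IP', 5555))  # SERVER_IP is a placeholder for the server's address

    while True:
        cmd = s.recv(4096).decode()
        if cmd == 'exit':
            break
        result = subprocess.run(cmd, shell=True, capture_output=True)
        s.send(result.stdout + result.stderr or b'(no output)\n')
    s.close()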

Python Automation - Scheduling tasks for specific days or timeframes

Python is a powerful language for automating tasks, from file backups to running commands or transferring server files. This script is a simple example of how Python can be used to schedule a task. For illustrative purposes, the function being called here is wait(), which in essence does nothing but sleep for one second before printing "jason". The function could just as well tell the system to back up all users' files, check inventory levels, or send weekly reports to certain people. The possibilities are endless, but the building block is here, and the great part is that I can create a Python program to automate tasks and routines to run anywhere from once every few seconds to once a week.

Simple Python scheduler
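
A minimal sketch of such a scheduler, assuming the third-party schedule module (pip install schedule); the two cadences shown are illustrative:

    # scheduler.py - simple Python task scheduler sketch
    import time
    import schedule   # third-party: pip install schedule

    def wait():
        time.sleep(1)            # stand-in for real work
        print("jason")

    schedule.every(10).seconds.do(wait)           # once every few seconds
    schedule.every().monday.at("09:00").do(wait)  # once a week, Monday at 9am

    while True:
        schedule.run_pending()   # run any jobs that are due
        time.sleep(1)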