David Hedges

Working with GitHub Copilot

October 25, 2023 by dhedges

Over the past few weeks, I’ve been using GitHub Copilot to assist me with the code and scripts that I’ve been developing. Overall, I have been impressed with its performance and functionality.

I have come across claims that AI can be used to generate code or scripts, but in my experience that hasn’t worked well. While it may provide a good framework to start with, asking AI to write an entire program results in unusable garbage that sounds authoritative and confident but is often incorrect. I wanted to clarify this because there seems to be a lot of talk about AI writing all the code, but that is not how I have been using it.

GitHub Copilot works by suggesting code snippets as you write your own code. I’ve found that these suggestions significantly speed up my coding. Whenever I reach a point where I would typically search online for a specific command or module, I can often add a comment describing what I’m trying to accomplish and receive a relevant suggestion from Copilot. While not all of the suggestions are exactly what I need, they are often close enough to require only minor editing. Additionally, I’ve had Copilot suggest code based on what I have already written, making my coding process even faster. It has even suggested examples that combine steps I would have previously done separately, resulting in shorter, more concise code.
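
As a contrived illustration (this is not actual Copilot output), the kind of comment prompt I’m describing, together with a completion representative of what you might accept, could look something like this:

#!/usr/bin/python3
# Contrived illustration only; not actual Copilot output.
# A descriptive comment like the next line is the sort of prompt that
# tends to produce a usable suggestion:
# read every .log file in a directory and return the total line count
import glob
import os


def count_log_lines(directory):
    total = 0
    for path in glob.glob(os.path.join(directory, '*.log')):
        with open(path) as fh:
            total += sum(1 for _ in fh)
    return total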

Based on my observations, AI is not the job killer some people claim it to be. Instead, it enhances your abilities and accelerates your work. It can be likened to how word processors made typing easier and faster than typewriters did.

Using Python to Download Mail Attachments

February 23, 2023 by dhedges

We had been using an online fax service that delivers faxes as PDF attachments to email. We didn’t want to manage the email account by hand, but instead wanted a script to strip out the attachments and then convert them to compressed TIFF format.

You’ll need Python and ImageMagick installed to make it all work. The poplib and email modules used in the script are part of the Python standard library, so there is nothing extra to install with pip.
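
As a quick, optional sanity check of those prerequisites (this check is my own addition, not part of the script below), something like this confirms that ImageMagick’s convert command is on the PATH and that the standard-library modules import cleanly:

#!/usr/bin/python3
# Optional sanity check (my own addition): confirm the prerequisites.
import shutil
import poplib   # ships with the Python standard library
import email    # ships with the Python standard library

if shutil.which('convert') is None:
    raise SystemExit("ImageMagick's convert command was not found on the PATH")
print("prerequisites look good")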

If you are using Gmail or mail hosted through Google, you’ll need to change your security settings. First, click your account in the upper right, then choose “Manage your Google Account”. Select Security from the menu on the left, then scroll down to “Less secure app access” and turn it on. (Google has been phasing this setting out in favor of app passwords, so you may need to generate an app password instead.)

The following is the Python code I put together to get this to work.

#!/usr/bin/python3
import poplib
import email
import os
import time
from subprocess import Popen, PIPE
'''
Python script to pull pdf attachments from email
Written by: David Hedges:: dhedges at hedgesadvantage dot com
'''

savedir = "<path to save location>"

server = poplib.POP3_SSL('pop.gmail.com', 995)
server.user('<email login>')
server.pass_('<email password>')


emails, tbytes = server.stat()
msg_list = server.list()


def pdf2tiff(dpath, name):
    '''convert the pdf to a compressed tiff with ImageMagick'''
    newf = name.replace('.pdf', '.tif')
    os.chdir(dpath)
    # pass the arguments as a list (no shell) so filenames with spaces are safe
    Popen(['convert', '-density', '200x100', '-compress', 'zip', '-depth', '1',
           name, newf], stdout=PIPE).wait()
    # fix perms
    os.chmod(os.path.join(dpath, name), 0o777)
    os.chmod(os.path.join(dpath, newf), 0o777)
    # remove the original pdf
    os.remove(os.path.join(dpath, name))


for i in range(emails):
    msg_data = server.retr(i+1)
    lines = msg_data[1]
    str_message = email.message_from_bytes(b'\n'.join(lines))
    #print(str_message)
    for part in str_message.walk():
        print(part.get_content_type())
        if part.get_content_type() == "application/pdf":
            filename = part.get_filename()
            if not filename:
                filename = "test.pdf"
            # add a timestamp so repeated attachment names do not collide
            tvar = time.strftime("%H%M%S")
            fname = filename.replace('.pdf', tvar + '.pdf')
            # write the attachment out and close the file before converting
            with open(os.path.join(savedir, fname), 'wb') as fp:
                fp.write(part.get_payload(decode=1))
            pdf2tiff(savedir, fname)
            server.dele(i+1)

server.quit()

Migrating to RedHat 9

October 11, 2022 by dhedges

RedHat announced a while back that anyone with a free developer account can run up to 16 instances of RHEL (https://developers.redhat.com/articles/faqs-no-cost-red-hat-enterprise-linux#general). After the announcement that CentOS would become an upstream OS to RHEL, we had been looking at moving to Oracle Linux or Rocky Linux. Both of these are very good, and in the end I worked a bit more with Oracle Linux.

I am now moving my Oracle Linux instances to RedHat 9. RedHat has incorporated some items into the OS that make it a bit more interesting. The addition of kpatch, which Oracle offers only as a subscription feature, is one that stands out. The updates to Cockpit are nice as well: it makes basic configuration easy, gives a quick view of server status, and adds the ability to perform Podman and some VM functions.

Migrating data between CentOS and Oracle Linux is no different than moving between any other Linux-based OSes. The biggest challenge is the apps that are not bundled, and verifying that they install correctly and function properly on the new OS.

I’m excited to move in this direction and may have some updates on where this leads in the future.

Using Ansible to generate inventory data

April 7, 2021 by dhedges

Ansible can be used to collect a lot of useful data that can be imported into an API- or database-driven inventory system, creating an automated way of keeping hardware information up to date.

Creating an Ansible YAML file to pull inventory facts into a JSON file is fairly simple:

---
- hosts: all
  gather_facts: no
  tasks:
    - setup:
      register: myinv
    - copy:
        content: "{{ myinv | to_nice_json }}"
        dest: /some/local/path/{{ inventory_hostname }}.json
      delegate_to: localhost

Running the playbook generates a JSON file for each host. This data can then be parsed and pushed directly into a database, used to update a system through an API, exported to a spreadsheet, or put to any number of other uses.

Below is a Python snippet showing how I parse the JSON files and pull the data they contain into variables.

#!/usr/bin/python
import glob, json
for file in glob.glob('/path/to/json/files/*.json'):
    hwinfo = json.load(open(file))
    if 'ansible_facts' in hwinfo:
        hostfqdn = hwinfo['ansible_facts']['ansible_nodename']
        hostip = hwinfo['ansible_facts']['ansible_default_ipv4']['address']

The JSON files are fairly easy to pull data from. From here you have the data and just need to decide how you will use it.
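
As one illustration of the spreadsheet option mentioned above, here is a minimal sketch (the output path and column names are my own, chosen just for the example) that collects the hostname and IP address from each facts file into a CSV:

#!/usr/bin/python
# Minimal sketch: gather hostname and IP from each facts file into a CSV.
# The paths and column names here are examples, not from the original post.
import csv
import glob
import json

rows = []
for file in glob.glob('/path/to/json/files/*.json'):
    with open(file) as fh:
        hwinfo = json.load(fh)
    if 'ansible_facts' in hwinfo:
        rows.append([hwinfo['ansible_facts']['ansible_nodename'],
                     hwinfo['ansible_facts']['ansible_default_ipv4']['address']])

with open('/path/to/inventory.csv', 'w', newline='') as out:
    writer = csv.writer(out)
    writer.writerow(['fqdn', 'ip_address'])
    writer.writerows(rows)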
