04Mar/21

Sublime Text Editor – Show All Whitespace

The ability to see all whitespace characters (spaces, tabs, etc.) is not obvious in Sublime Text and not easily accessible.

You can change it by doing this:

Open your user settings (Preferences -> Settings) and add this item into your settings, between the { and }:
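The item is the draw_white_space setting; the value "all" draws every whitespace character (the default is "selection", which only draws whitespace inside selections):

"draw_white_space": "all",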

Make sure to properly end it with a comma if it's not the last item in the dict; if it is the last item, you can drop the trailing comma.

However, you can easily set yourself some keyboard shortcuts by editing the keymap (Preferences -> Key Bindings). Put these in between the [ and ] characters (the keymap is a list of dicts with the keys "keys", "command" and "args").

Likewise, make sure to remove the comma at the end if it's the last item, and keep it if you have further items in there.
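Here is a sketch of the two bindings, using Sublime's set_setting command (the key combos match the shortcuts below):

{ "keys": ["shift+alt+d"], "command": "set_setting", "args": { "setting": "draw_white_space", "value": "all" } },
{ "keys": ["shift+alt+a"], "command": "set_setting", "args": { "setting": "draw_white_space", "value": "selection" } }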

Shift+Alt+D to show all whitespace

Shift+Alt+A to revert to normal

On a Mac, replace Alt with the Option key.

Sidenote: My key bindings so far are simple, so my keymap contains just the two entries shown above.

11Dec/20

rhood – robinhood portfolio analysis tool (better net profits per symbol)

GitHub: https://github.com/bhbmaster/rhood – download location with install & run instructions.

Using the robin-stocks Python module, I created my own Robinhood portfolio analyzer called rhood. It parses all of the account information provided by the Robinhood API and produces a single text output covering your whole portfolio: orders, open positions, and dividend information. Mainly, it parses all of your orders and outputs sorted orders, open positions, informative profit figures, and dividend information. It gives a good figure for your total net gain, and for the net gain of any position currently or previously owned (currently the default Robinhood app doesn't show this information nicely).

It requires Python, the robin-stocks pip package (a Robinhood API client), and the pyotp pip package (a two-factor authentication module). It can be run on Windows, macOS, or Linux.
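For a taste of how the robin-stocks module is used, here is a minimal login-and-fetch sketch (the credentials and TOTP secret are placeholders, the function names are from recent robin-stocks versions, and rhood itself does much more processing on top of these calls):

import pyotp
import robin_stocks.robinhood as r

# generate the current 2FA code from your TOTP secret, then log in
totp = pyotp.TOTP("MY2FACTORSECRET").now()
r.login("you@example.com", "yourpassword", mfa_code=totp)

stock_orders = r.get_all_stock_orders()    # every stock order ever placed
crypto_orders = r.get_all_crypto_orders()  # every crypto order
dividends = r.get_dividends()              # dividend history
holdings = r.build_holdings()              # current open positions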

This prints a lot of information about your stocks, crypto, and options (see note 1):

  • all of the orders (it creates CSVs from them as well)
  • all open positions
  • net profit calculations

It provides useful information that I couldn't find in the Robinhood app itself, e.g. your profit per stock. Robinhood has a section showing total return, but that seems to clear out if you sell all of a stock. My application doesn't do that; it shows you the total profit (or loss) for each symbol: stock, crypto, or option (see note 1).

* Note 1 – Work in progress: options are not implemented yet. If you only use stocks and/or crypto you are set; otherwise, options are skipped/ignored.

* Note 2: I used the robin-stocks module. There are some other modules that talk to the Robinhood API as well, such as https://github.com/robinhood-unofficial/pyrh and https://github.com/mstrum/robinhood-python. I didn't use these, as they seem to be older.

17Nov/20

Create Github Repo On The Go From The Shell – Github API

Sometimes you start coding something and don't yet know whether it will be a project worth sharing on GitHub; we don't always think about this as we begin coding. This is a write-up on how to handle the case where you code something up and then, after a while, realize you want to share it. The method also works for creating fresh new repos you haven't started coding yet.

Of course, the normal practice is to create a GitHub repo from the browser and then follow the shown commands to git init it locally.

That's all fine and nice, but it's time consuming to open up the browser and create the GitHub repo. Luckily, we can do it using the GitHub API.

First you need to get a GitHub API key. Then, using curl, you can send a command to GitHub to create your repo for you. There are many settings you can tweak; specifically, we want the basics: a public repo without any commits or files (no README.md).

Step 1 – Create API Token

First you have to create an authentication token – a personal access token:

  • Login to github.com on your browser
  • Go to Settings -> Developer settings -> Create Personal Access Token
  • Hit the Generate button
  • In the note textbox, write its purpose, e.g. "creating repos from command line"
  • Then give it proper access in the scopes:
    • Check on everything in the repo section
    • Check on gist
  • That is it. When you submit this info, it will give you an access token (long alphanumeric string)
  • Save that access token string. We will be using it in our curl commands. This token is the equivalent of providing a username and password (so don't lose it and don't share it)

Step 2 – Command Line

After you get your key you can now use it in the shell. For example’s sake we use abc123 as the key (your key will have more characters).

Here is how it will look in your workflow:

  • First, create directory and code some stuff
  • Realize you are making a repo
  • git init the repo. That only saves it locally
  • Make a commit
  • Now create the GitHub repo using the curl command (see the sketch after this list). Note we set auto_init to false so that it doesn't create a first commit with a template README.md file. Also, since we want a public repo, we set private to false.
    • Change abc123 to your authorization token alphanumeric value
    • Change REPONAME to your repo name. Only use letters, numbers, dots, underscores, and minuses: A-Za-z0-9._-
  • Then set the git origin, which is the remote repository server and repo. Make sure to use the https://github.com/USERNAME/REPONAME.git link; if you set the wrong link, remove it with git remote remove origin. This link is shown in the curl output (look for "clone_url")
    • Change USERNAME to your github username and REPONAME to your reponame
  • Set the branch (master or main) as upstream and push, as shown below
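A sketch of those commands (abc123, REPONAME, and USERNAME are the placeholders described above; use main instead of master if that is your default branch):

curl -H "Authorization: token abc123" https://api.github.com/user/repos -d '{"name":"REPONAME","private":false,"auto_init":false}'

git remote add origin https://github.com/USERNAME/REPONAME.git
git push -u origin master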

More Info:

  • More information on the GitHub API, such as more options to pass in the JSON string with the -d argument to affect the type of repo that gets created: https://developer.github.com/v3/repos/#create-a-repository-for-the-authenticated-user
  • The simplest form of this call, curl -H "Authorization: token abc123" https://api.github.com/user/repos -d '{"name":"REPONAME"}', creates the repo with the API defaults. For a fluid work process we explicitly add "auto_init": false so that no initial commit with a template README.md is created, and we set "private": false so that we get a public repo.
  • Previously, you could use the API without a token by using your username and password; that has been deprecated as it's unsafe. The commands looked like this: curl -u user:pass https://api.github.com/user/repos -d '{"name":"REPONAME"}'

Making an alias for easier use

It might be annoying to always type those long commands, so you can write an alias and stick it in your .bashrc or .bash_profile. However, that alias would be really long, so I prefer a shell function instead: a regular bash function defined in your shell startup file that can then be called from the shell like any command.

Here is what I have in my .bashrc/.bash_profile:
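A sketch of the two variants (the function bodies are an approximation; the CreateGitRepo name, the abc123 placeholder, the ~/.github-api-key file, and the two echoed follow-up commands come from the notes below):

# ALIAS 1 - token stored in a variable inside the function
CreateGitRepo() {
    local TOKEN="abc123"   # put your auth token here
    local REPONAME="$1"    # repo name is the first argument
    curl -H "Authorization: token ${TOKEN}" https://api.github.com/user/repos -d "{\"name\":\"${REPONAME}\",\"private\":false,\"auto_init\":false}"
    # print the final 2 commands to set origin and push (copy and paste them)
    echo "git remote add origin https://github.com/USERNAME/${REPONAME}.git"
    echo "git push -u origin master"
}

# ALIAS 2 - token read from a file (comment out whichever variant you don't use)
# CreateGitRepo() {
#     local REPONAME="$1"
#     curl -H "Authorization: token $(cat ~/.github-api-key)" https://api.github.com/user/repos -d "{\"name\":\"${REPONAME}\",\"private\":false,\"auto_init\":false}"
#     echo "git remote add origin https://github.com/USERNAME/${REPONAME}.git"
#     echo "git push -u origin master"
# }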

Don't forget to comment out whichever function you don't want to use (ALIAS 1 or ALIAS 2). I personally use ALIAS 1, as I don't like having extra files.

For ALIAS 1, don't forget to put the auth token in the variable, replacing abc123.

For ALIAS 2, don't forget to create the file ~/.github-api-key containing your key: echo "abc123" > ~/.github-api-key

If you just added the function, don't forget to source your .bashrc or .bash_profile (or whichever file you put it in): source ~/.bashrc

Using the Alias / Function on the Go

Now you can finally use it like this – just one example (you can use it in other ways; e.g. you can start with the CreateGitRepo command):

  • Create a directory Project1
  • Change into the dir
  • Code some stuff up (wow.js)
  • Initialize the local git repo in the current dir
  • Stage the current files (wow.js)
  • Commit the staged files to the local repo with a descriptive comment
  • Now create the empty public remote repo (in other words, create the repo on GitHub)
  • This will show a lot of lines of output if all worked correctly.
  • So far the repo has been created, but the code has not been pushed to it yet. The final 2 lines of the CreateGitRepo output are the final 2 commands to set the origin and push the code to GitHub. For this example, the whole sequence looks like this:
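Assuming the CreateGitRepo function sketched earlier (USERNAME again stands in for your GitHub username):

mkdir Project1                          # create a directory Project1
cd Project1                             # change into the dir
vim wow.js                              # code some stuff up
git init                                # initialize the local git repo
git add wow.js                          # stage the current files
git commit -m "first commit of wow.js"  # commit with a descriptive comment
CreateGitRepo Project1                  # create the empty public repo on GitHub

git remote add origin https://github.com/USERNAME/Project1.git
git push -u origin master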

Create Private Git Repo

After using this function a bit, I realized a private repo creator is just as useful. Here is the same function made for private repos; it has an extra suffix to differentiate it. Copy and paste it into your .bashrc or .zshrc.
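A sketch under the same assumptions as before (the Private suffix is a guess at the naming):

CreateGitRepoPrivate() {
    local TOKEN="abc123"   # put your auth token here
    local REPONAME="$1"
    # same call as the public version, but with "private": true
    curl -H "Authorization: token ${TOKEN}" https://api.github.com/user/repos -d "{\"name\":\"${REPONAME}\",\"private\":true,\"auto_init\":false}"
    echo "git remote add origin https://github.com/USERNAME/${REPONAME}.git"
    echo "git push -u origin master"
}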

Final Thoughts

All of the earlier commands create a public repo. You can modify that by setting private to true (change "private": false to "private": true); if you want, you can even create your own function for that, as shown above.

The end

12Nov/20

Backup file[s] to Dropbox (without syncing)

This GitHub page, https://github.com/andreafabrizi/Dropbox-Uploader, hosts the dropbox-uploader tool, which I use to back up content from servers (Linux, Mac, etc.) without having to sync my Dropbox content to the server's local disk.

It uses the Dropbox API (docs here). You can use the API pretty easily with plain curl commands, but for files over 150 MiB it gets complicated (chunked uploads and such).

Before using the uploader, get a Dropbox API key. Here is how you do that.

Note that there are 2 levels of access each API key gets (which you configure):

  • Full access – allows the key to access your entire Dropbox (so if it fell into the wrong hands, it could read/write/delete anything – even delete everything). The benefit is that you can specify any directory in your Dropbox file system to back up files to.
  • Application access – limits the key to only the /Apps/<appname> directory (if /Apps/<appname> is missing, it will be created at the root of your Dropbox file system). So you can only back up content to /Apps/<appname>/. Within that directory the key has full control (well, the access can be fine-tuned).

For backup purposes, just use the application-level access; there is no reason for it to potentially have access to all of your Dropbox data.

More info on these access levels and OAuth authentication is here: https://www.dropbox.com/lp/developers/reference/oauth-guide

Steps by example:

First install dropbox_uploader.sh to any location of your choosing in your filesystem.

Now create a db.conf file. If you run dropbox_uploader.sh without the -f option, it will prompt you for your OAuth key and create the conf file in its default location. I prefer putting it in a custom location:

echo "OAUTH_ACCESS_TOKEN=gvi4325ffFAKEKEYwadfasd-i234asdfa" > ~/backups/db.conf

Then, if I want to copy a file called yourfile.txt, I first delete it from the destination. I do this to avoid possibly time-consuming hash checks: if the file already exists at the destination, this script does a hash check.


./dropbox_uploader.sh -f ~/backups/db.conf delete /yourfile.txt

Then I upload the file by specifying the source and destination locations. Note the destination must start with a forward slash. If your key has full access, it will go to Dropbox:yourlocation; if your key has app access, it will go to Dropbox:/Apps/appname/yourlocation. If you are unsure whether the directories exist, don't worry – the API creates the needed directories (even a few nested ones).

./dropbox_uploader.sh -f ~/backups/db.conf upload ~/yourfile.txt /yourfile.txt

Cron

I use this tool to back up servers (data compressed with tar+xz for good, high compression) to Dropbox. My backup script, called by cron daily (or however often), does all of the above steps. I create the db.conf every time; that way I can see all of the settings and commands in the backup script itself and don't have to rummage through my filesystem for a config file.
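A sketch of such a backup script (the paths, cron schedule, and backed-up directory are example values):

#!/bin/bash
# daily Dropbox backup - run from cron, e.g.: 0 3 * * * /root/backups/backup.sh
CONF=~/backups/db.conf
ARCHIVE="backup-$(date +%F).tar.xz"

# recreate the conf every run so all settings are visible in this one script
echo "OAUTH_ACCESS_TOKEN=gvi4325ffFAKEKEYwadfasd-i234asdfa" > "$CONF"

# compress the data with tar+xz
tar -cJf "/tmp/$ARCHIVE" /data/to/backup

# delete any old copy first to avoid the hash check, then upload
/root/backups/dropbox_uploader.sh -f "$CONF" delete "/$ARCHIVE"
/root/backups/dropbox_uploader.sh -f "$CONF" upload "/tmp/$ARCHIVE" "/$ARCHIVE"
rm -f "/tmp/$ARCHIVE"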

The end

09Nov/20

xargs parallel note to self

xargs is useful for running commands in parallel, and its parallel processing is very efficient. Read this post about its efficiency and this one about basic commands.

Below is my favorite way to run a single command repeated (or parallelized) over many inputs:
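Something like this (the -P 8 parallelism level is just an example):

cat inputlist | xargs -P 8 -I {} command {}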

Replace inputlist with anything. Each line of the input list gets run once by the command; within the command you can use {} wherever you need the line's value.

Below is a multi-command method:
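A sketch, again assuming 8 parallel runs; the commands run inside a small sh -c script:

cat inputlist | xargs -P 8 -I {} sh -c 'command1 {}; command2 {}'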

Note: I recommend single quotes on the outside, as seen here.

Note: you can redirect the output at the very end, outside of the single quotes; this way the output of all runs is saved together. You can save each run separately if you redirect inside the quotes.

Example redirections:
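For instance (all of the names here are placeholders):

cat inputlist | xargs -P 8 -I {} sh -c 'command1 {}; command2 {}' > all-runs.log 2>&1    # one combined log: redirect outside the quotes
cat inputlist | xargs -P 8 -I {} sh -c 'command1 {} > {}.log 2>&1'                       # one log per run: redirect inside the quotes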

The end

29Oct/20

Python + Guitar + All Notes

I came across this Medium article about Python, guitar strings, and plotting scales. It has an interesting Jupyter notebook to work with, allowing you to plot the scales for all of the chords. It was great, except it only covered 20 frets, and my guitar has 24. So I modified the script to allow for 24 frets and to save the plots. (I have not tested with more than 24.)

So you can get something like this:

C Major scale (every whole note):

E Minor Blues scale:

The rest can be viewed on my GitHub (go into the Scales directory).

I then came across articles like this that also mapped each guitar string note to a MIDI-style value from 16 (low E) to N, where N is the highest note. In my case of 24 frets, N is 64, as that is the 24th fret on the high E string. Each fret increments the value by 1. I found this fascinating, so I modified the scripts to print all notes with their values. Immediately you see the pattern that every 5th fret is the same note as the next string played open (besides the change from the G to the B string, where it is the 4th fret).
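A small sketch of that mapping (my own illustration, using the post's numbering where the open low E is 16 and each fret adds 1):

# print the note name and value of every fret on every string (24-fret neck)
NOTES = ["E", "F", "F#", "G", "G#", "A", "A#", "B", "C", "C#", "D", "D#"]
OPEN_VALUES = {"E": 16, "A": 21, "D": 26, "G": 31, "B": 35, "e": 40}  # standard tuning
FRETS = 24

for string, open_value in OPEN_VALUES.items():
    for fret in range(FRETS + 1):
        value = open_value + fret
        name = NOTES[(value - 16) % 12]  # value 16 is the note E
        print(f"string {string} fret {fret:2d}: {name:2s} value {value}")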

Here is every whole note with midi values. This is also the C major scale:

If the images are too small then just zoom in.

23Oct/20

Python remove duplicates similar to bash uniq + sort
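A minimal sketch of the idea in the title – the Python equivalent of sort file | uniq:

# read lines, de-duplicate with a set, then sort
with open("file.txt") as f:
    lines = f.read().splitlines()
for line in sorted(set(lines)):
    print(line)

# to de-duplicate while keeping the original order (uniq without sort), use:
# for line in dict.fromkeys(lines): print(line)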

26Jun/20

iostat service time (svctm) rule of thumb

iostat's service time is a very useful metric when analyzing disk performance and finding bottlenecks.

Service time is essentially the inverse of IOPS.

So if an operation takes 1 ms to service, then your IOPS are 1000 (you can complete 1000 operations in a second if each operation takes 1 ms).

The formula for this is as follows; just put it into Google (with your number in place of S) and it will do the math for you:

(S)^-1 = ? hz

S is the service time in milliseconds. Ignore the hz word; that's just to convert the output to plain IOPS instead of kilo-IOPS etc.

In your calculator, it's the equivalent of this:

IOPS=(S/1000)^-1

for 1 ms service time we have 1000 IOPs

for 2 ms service time we have 500 IOPs

for 10 ms service time we have 100 IOPs
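The same arithmetic as a quick shell one-liner (my own example; the echoed number is the service time in ms):

echo 2 | awk '{ printf "%d IOPS\n", 1000 / $1 }'    # 2 ms service time -> 500 IOPS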

Generally, for SSDs I like to see service times between 0 and 1 ms (it can jump above 1 ms every now and then, but if it does that often, look into speeding up your SSDs; perhaps you need to disable or enable disk caching).

For HDDs, service times between 1 and 10 ms are good, and between 10 and 15 ms are OK. Anything above that and your disks are pretty busy.

Ex: running iostat -x 3 on my NAS, we see HDD service times between 1 and 10 ms, so we are good and don't have any HDD bottlenecks.

If I saw these numbers consistently on an all-SSD NAS, then I would be worried.

  • svctm is how long a request takes to process outside of the OS.
  • whereas await is how long a request takes to process in total (within the OS and outside of the OS).

I am doing a big write operation on my NAS, so you see my w_await values (writes) showing numbers but not my r_await (reads). If I were reading, my r_await would have values; if I were doing both, then both r_await and w_await would have values.

Notice that await will always be bigger than svctm. This is because of where the time measurements are taken: await also includes the time a request spends being processed within the OS.

Note: this was just a look at await + svctm. Based on my other metrics, my queue size looks entirely too big, so if my NAS shut down right now, many operations would not be written; this might result in a corrupted filesystem.

15Jun/20

Favorite Watch of 2019+2020

I am not a fan of bulky watches; I like them sleek and gorgeous. The winner goes to my friend's company:

https://durdenwatch.com/

The Durden watch is a sleek and sexy watch. I have had the privilege of owning both versions. Personally I like the black-background watch the most, but the white one was also very beautiful. In the end, both look amazing and last a long time. Mine lasted a year until I lost both of them; I like to imagine they are still running wherever they might be. The white one handles scratches better – you don't notice them. The only downside of the black watch is that you notice scratches on the glass surface more easily.

Why is it called Durden? The name comes from the movie Fight Club; it is a reference to the main character, Tyler Durden. My friend has always enjoyed that movie.

So if you are a Fight Club fan, then this watch is a must have.

16Mar/20

Coronavirus Covid19 Dashboard ( covid19plots.py + usa-ca/county-plots.py )

A coronavirus COVID-19 dashboard, created by me, for the world and each country. There are plots with a log y-axis scale and with a normal y-axis scale, plus a plot showing the trend for each California county – I created this to help understand what tier* each county will be in at any given date. Source code is provided below:

Source code: https://github.com/bhbmaster/covid19

Data Sources

World covid data for each country gathered from here: https://pomber.github.io/covid19/timeseries.json

For more info on this data go to the github link: https://pomber.github.io/covid19/

The California county data is gathered from data.ca.gov covid-19 cases: https://data.ca.gov/dataset/covid-19-cases/resource/926fd08f-cc91-4828-af38-bd45de97f8c3

This provides the data in a csv format downloaded here: https://data.ca.gov/dataset/590188d5-8545-4c93-a9a0-e230f0db7290/resource/926fd08f-cc91-4828-af38-bd45de97f8c3/download/statewide_cases.csv

California County Tier System Info

California uses a tier system to determine which businesses can be open in each county.

This tier system uses the positivity rate and the daily new cases per 100,000 over a 7-day rolling average (also known as a moving average). Since the positivity rate lately is very good, I do not consider it in my California county plots; I plot the daily new cases per 100K with a 7-day moving average. It turns out there is an offset/correction applied by the state that I couldn't figure out.
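A sketch of that calculation with pandas (the column names follow the statewide_cases.csv layout as I understand it; the county and population figure are just examples):

import pandas as pd

df = pd.read_csv("statewide_cases.csv", parse_dates=["date"])
county = df[df["county"] == "Los Angeles"].sort_values("date")
population = 10_000_000  # example county population
per_100k = county["newcountconfirmed"] / population * 100_000
rolling7 = per_100k.rolling(7).mean()  # 7-day moving average of daily new cases per 100K
print(rolling7.tail())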

To see the corrected values, check this link (it also explains the tiers).

Additional Info

  • The charts are updated every 6 hours starting at midnight PST. However, the values at the data source are updated only daily, so don't expect new values until after midnight.