Brief

Technical

  • Python (FastAPI, Pydantic / SQLAlchemy, Flask, LangChain, aiohttp, AsyncIO).
  • JavaScript / TypeScript, React.js, Redux.js / Redux Toolkit, D3.js.
  • WebSocket, RabbitMQ (RPC, pub-sub), RESTful API development.
  • Deployment, CI/CD: AWS, Firebase, Docker, GitLab CI.

Dev Tools

  • GitLab, GitHub.
  • SSH, vim, Git, and other common CLI tools.
  • Slack, Mattermost.
  • VS Code (and its derivatives), OpenCode, Claude Code.

General: Languages

  • English (near-native / professional communication).
  • Kazakh (native).
  • Russian (expert / professional communication).
  • Currently self-studying Japanese.

More Detailed

In my recent projects, I drew on knowledge in the following areas:

  • Programming languages & packages: Python (w/ AsyncIO, aiohttp, Flask), JavaScript, React.js, Redux.js (RTK), D3.js.
  • WebSocket, RabbitMQ (RPC, pub-sub), and RESTful API-based (OpenAPI 3 spec) service-to-service communication.
  • Object storage: used the S3 API in Python to integrate with S3-compatible object storage.
  • Dev. tools/IDEs: VS Code, GitLab web UI, SSH, Git, and common command-line tools.
  • CI/CD: Used GitLab CI on a self-hosted server for safe and efficient offline experimentation.
  • Containerization & virtualization: primarily used Docker and docker-compose, but also experimented with alternatives (Podman, QEMU, LXC, and VMs) on Debian and Proxmox hypervisors.
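As an illustration of the CI/CD setup above, a minimal `.gitlab-ci.yml` in this spirit (job names, image tags, and commands are illustrative placeholders, not taken from a real project) might look like:

```yaml
# Illustrative two-stage pipeline sketch for a self-hosted GitLab runner.
stages:
  - test
  - build

run-tests:
  stage: test
  image: python:3.12-slim          # placeholder base image
  script:
    - pip install -r requirements.txt
    - pytest

build-image:
  stage: build
  image: docker:24
  services:
    - docker:24-dind               # Docker-in-Docker for image builds
  script:
    # CI_REGISTRY_IMAGE and CI_COMMIT_SHORT_SHA are predefined GitLab CI variables.
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
```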

Some other tools and concepts I am familiar with:

  • Collaboration tools: Slack, Mattermost, GitLab Issues.
  • Documentation/markup notations: Markdown, LaTeX.
  • Other common notations: JSON, YAML, TOML, OpenAPI.
  • Cloud/distributed computing: currently taking an online course on cloud computing and AWS.
  • Distributed storage: experimented with deploying a minimal Ceph cluster (an open-source storage system) on Rocky Linux and Debian VMs.
  • Data structures and algorithms: a good understanding of time and space complexity, common software design patterns, and computational methods.
  • Operating systems: macOS and Linux (Debian/Ubuntu, CentOS/Rocky Linux/Amazon Linux), OpenWrt.

Homelab

I use a low-power, multi-node homelab cluster to run experiments and deepen my understanding of different concepts:

  • High availability (HA) and reverse proxying (Traefik with SSL/TLS, nginx).
  • Virtualization: Docker, QEMU, LXC (Proxmox).
  • Storage and backup solutions: TrueNAS, Ceph.
  • Networking: VLANs, WireGuard, routing, DNS.
  • Self-hosting of dev-productivity services, e.g. GitLab, an S3-compatible object store, and a password manager.
  • Experimentation with various operating systems and software.
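By way of illustration, the reverse-proxy piece of such a setup can be sketched in Docker Compose roughly as follows (service names, domain, and email are placeholders, not my actual configuration):

```yaml
# Sketch: Traefik terminating TLS in front of a self-hosted GitLab instance.
services:
  traefik:
    image: traefik:v3.0
    command:
      - --providers.docker=true
      - --entrypoints.websecure.address=:443
      - --certificatesresolvers.le.acme.tlschallenge=true
      - --certificatesresolvers.le.acme.email=admin@example.com   # placeholder
      - --certificatesresolvers.le.acme.storage=/letsencrypt/acme.json
    ports:
      - "443:443"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
      - letsencrypt:/letsencrypt

  gitlab:
    image: gitlab/gitlab-ce:latest
    labels:
      - traefik.enable=true
      - traefik.http.routers.gitlab.rule=Host(`gitlab.example.lan`)  # placeholder host
      - traefik.http.routers.gitlab.entrypoints=websecure
      - traefik.http.routers.gitlab.tls.certresolver=le
      - traefik.http.services.gitlab.loadbalancer.server.port=80

volumes:
  letsencrypt:
```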

One of my ongoing projects on this setup is a system that reads real-time trading data from Binance (and similar platforms) through their public APIs and reproduces several metrics available on their web UIs. The goal of this project is to practice building robust systems that collect and analyze real-time data from public sources.
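As a sketch of the metric-reproduction part, a rolling volume-weighted average price (VWAP) over incoming trades could look like the following. The field names ("p" for price, "q" for quantity, both as strings) follow the shape of Binance's public trade-stream payloads; the window size and class name are illustrative:

```python
from collections import deque
from decimal import Decimal


class RollingVWAP:
    """Volume-weighted average price over the last `window` trades."""

    def __init__(self, window: int = 100):
        # deque with maxlen automatically evicts the oldest trade.
        self.trades: deque = deque(maxlen=window)

    def update(self, trade: dict) -> Decimal:
        # Prices/quantities arrive as strings; Decimal avoids float rounding.
        price, qty = Decimal(trade["p"]), Decimal(trade["q"])
        self.trades.append((price, qty))
        notional = sum(p * q for p, q in self.trades)
        volume = sum(q for _, q in self.trades)
        return notional / volume


vwap = RollingVWAP(window=3)
vwap.update({"p": "100", "q": "1"})
vwap.update({"p": "102", "q": "1"})
print(vwap.update({"p": "104", "q": "2"}))  # (100 + 102 + 208) / 4 = 102.5
```

In a real pipeline this computation would sit behind a WebSocket consumer, but keeping it a pure, windowed function makes it easy to test against recorded data.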

Narrow-area (and rusty) skills

Finally, in no particular order, below are some of the tools & tech I have used in various projects in the past, including some that are less generic (e.g., used in specific fields of experimental research). While not all of them apply to modern projects and my knowledge of some may be out of date, they are part of my journey in tech and academia (and often contribute to my thought process):

  • Programming languages & IDEs: C, Java, PL/SQL, PHP, Eclipse, Android Studio.
  • Software tools/packages: PyQt, GStreamer (for writing a real-time data processing pipeline), NI LabVIEW, LIGO Digital Control System Tools, EPICS/MEDM, SVN, VirtualBox.
  • Used MATLAB and Python for time-domain and frequency-domain analysis of measurement data.
  • Updated analysis scripts to run on high-performance computing clusters with HTCondor (within LIGO-Caltech and KAGRA-Uni. of Tokyo).
  • Other research-related experience: conducted tabletop optical experiments, developed/improved calibration procedures for large-scale gravitational-wave observatories.