Archive: 2022/4

SQL Developer 깃허브 연동하기

Step 1. Prepare GitHub Create a new public repository on GitHub to link with SQL Developer. In Settings > Developer settings, issue a new personal access token. Enter the name of the newly created repository and, under Select scopes, check repo
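As a rough illustration of how the token is used afterwards, here is a minimal command-line sketch of pushing a local repository to that GitHub repository over HTTPS; the username, repository name, and branch name are placeholders, not values from the post.

$ git init
$ git add . && git commit -m "first commit"
$ git remote add origin https://github.com/<username>/<repository>.git
$ git push -u origin main   # paste the personal access token when prompted for a password; the default branch may be master depending on the Git version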

Oracle 19c Installation in Windows11

Step 1. Install Oracle Database Run the setup file as administrator and follow the procedure below. If the following error occurs, go back to the beginning and change to ‘Software Only Se
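Once the installer finishes, a quick way to confirm the database is reachable is to connect with SQL*Plus; this is a general verification sketch, not a step taken from the post.

$ sqlplus / as sysdba                      # OS authentication; on Windows this requires membership in the ORA_DBA group
SQL> SELECT banner FROM v$version;         # prints the installed Oracle Database version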

Crawling Music Chart Top100

Website Info Request URL : https://music.bugs.co.kr/chart Request Method : GET User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/100.
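Since the excerpt lists the request URL, method, and User-Agent header, the same request can be reproduced from the command line; a minimal curl sketch (curl stands in for whatever HTTP client the post actually uses).

# paste the full User-Agent value copied from the browser in place of the placeholder
$ curl -o chart.html -H "User-Agent: <full User-Agent string>" "https://music.bugs.co.kr/chart"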

Crawling Headline News

Check the Website Info Open the website's Developer Tools and go to the Network tab. Press Ctrl + R and open the Doc tab. Select the site entry and check its Headers tab. Copy the valu
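The same header check can be done from the command line; a sketch using curl, with a placeholder URL and header value rather than the ones from the post.

# replace the URL with the news site from the post and the header value with the one copied from the Headers tab
$ curl -sI -H "User-Agent: <copied value>" "https://news.example.com"   # -I requests only the response headers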

Crawling Data from Web

Step 1. Set virtual environment Create a new directory under the C drive and a virtual environment. $ mkdir crawling && cd crawling $ virtualenv venv $ source venv/Scripts/activate Install so
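The excerpt cuts off at installing packages; a typical follow-on for a crawling environment would look like the line below. The package names are an assumption, not taken from the post.

(venv) $ pip install requests beautifulsoup4   # common crawling libraries; install whatever the post actually specifies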

Spark Installation in WSL2

Step 1. Install required files Install Java and the Spark file. (Skip if already installed.) $ sudo apt-get install openjdk-8-jdk $ sudo wget https://archive.apache.org/dist/spark/spark-3.2.0/spark-3.
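After the download, the archive is usually unpacked and SPARK_HOME put on the PATH; a sketch assuming a spark-3.2.0 binary build whose exact file name is cut off in the excerpt.

$ tar -xzf spark-3.2.0-bin-hadoop3.2.tgz              # archive name assumed; use the file actually downloaded
$ export SPARK_HOME=$HOME/spark-3.2.0-bin-hadoop3.2   # adjust to wherever the archive was extracted
$ export PATH=$PATH:$SPARK_HOME/bin
$ spark-submit --version                              # confirms Spark is reachable from the shell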

Spark Installation on Windows11

Step 1. Install Java JDK Download the Windows installer. URL : https://www.oracle.com/java/technologies/javase/javase8u211-later-archive-downloads.html Run the downloaded file as an administrator. Modi
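A quick check that the JDK is installed and on the PATH, runnable from Command Prompt or PowerShell as well; this is a general verification step, not one quoted from the post.

$ java -version    # should print the installed JDK version; if not, recheck PATH and JAVA_HOME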

ElasticSearch and Kibana Setting in WSL 2

Step 1. Install Packages Update the system packages and install a package required for HTTPS. $ sudo apt update $ sudo apt install apt-transport-https Install Java and check the version of Java.
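The excerpt cuts off at installing Java and checking its version; a sketch of that step, where the OpenJDK version is an assumption rather than the one the post names.

$ sudo apt install openjdk-11-jdk   # JDK version is an assumption; use the one the post specifies
$ java -version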

Link VSCode with Remote WSL

Step 1. Install VSCode URL : https://code.visualstudio.com/download Download the System Installer for your OS. Check ‘Add to PATH’ and reboot after installation. Step 2. Link Remote WSL Ins
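Once VSCode is on the PATH, the Remote - WSL extension can also be installed and a WSL folder opened from the terminal; a minimal sketch of that linkage.

$ code --install-extension ms-vscode-remote.remote-wsl   # install the Remote - WSL extension from the command line
$ code .                                                 # run inside a WSL shell to open the current folder over Remote WSL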

Establishing an Airflow Data Pipeline

Step 01. Create a Virtual Data Create a dags folder under the (venv) airflow-test folder. $ mkdir dags $ ls airflow-webserver.pid airflow.cfg airflow.db dags logs venv webserver_config.py Install
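With the dags folder in place, the webserver and scheduler can be started so that DAG files placed there get picked up; a sketch assuming Airflow 2.x defaults, not commands quoted from the post.

(venv) $ airflow webserver --port 8080   # serves the UI at http://localhost:8080
(venv) $ airflow scheduler               # run in a second terminal; scans the dags folder for DAG files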