While installing TensorFlow using pip, you might get stuck at the grpcio installation step. Happy coding!
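A common cause of this hang is an old pip building grpcio from source instead of using a prebuilt wheel. A minimal sketch of the usual workaround, assuming that is the cause here (the post's exact solution is truncated):

```shell
# grpcio can take a very long time to compile from source with an old pip.
# Upgrading pip/setuptools/wheel first usually lets it install a prebuilt wheel.
python3 -m pip install --upgrade pip setuptools wheel
python3 -m pip install grpcio       # should now install quickly from a wheel
python3 -m pip install tensorflow
```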
Environment: Ubuntu 18.04.
Error 1: OSError: mysql_config not found (Solution 1).
Error 2: unable to execute ‘x86_64-linux-gnu-gcc’: No such file or directory (Solution 2).
Error 3: MySQLdb/_mysql.c:46:10: fatal error: Python.h: No such file or directory (Solution 3).
Final Code. Happy Coding!
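All three errors are classic missing-system-package failures when pip-installing the MySQL client on Ubuntu. A sketch of the standard fixes; the package names are the usual Ubuntu 18.04 ones and are assumed to match the post's solutions:

```shell
# Error 1: "mysql_config not found" -> install the MySQL client dev headers
sudo apt-get install -y libmysqlclient-dev

# Error 2: "x86_64-linux-gnu-gcc: No such file or directory" -> install a C toolchain
sudo apt-get install -y build-essential

# Error 3: "Python.h: No such file or directory" -> install the Python dev headers
sudo apt-get install -y python3-dev

# With the system packages in place, the install should succeed:
pip install mysqlclient
```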
In this post, we will talk about how to connect AWS API Gateway to EC2. Normally, we use API Gateway to build an API when we are using Lambda. However, in some cases you might want to use API Gateway as a proxy for an EC2 instance server. Let’s say we have an EC2 instance running the Flask application that we installed in the last post. We want to make an API that points to the server…
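One way to sketch this setup is with the AWS CLI's HTTP API "quick create", which makes an API whose default route proxies every request to a backend URL. The post may well use the console instead, and the endpoint URL below is a placeholder for your instance's address:

```shell
# Create an HTTP API that proxies all requests to the Flask server on EC2.
# Replace the target with your instance's public DNS name and port.
aws apigatewayv2 create-api \
    --name flask-proxy \
    --protocol-type HTTP \
    --target "http://YOUR_EC2_PUBLIC_DNS:5000"
```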
In Python, to get a UNIX timestamp, you need to use the time library as shown below. Since its type is a float, you need to cast it to an int if necessary. Furthermore, if you want to account for your timezone, you can simply add the hours, as below. The following example is the UTC+9 case. Happy coding!
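A minimal sketch of the steps described above (the UTC+9 offset is the example from the post):

```python
import time

# time.time() returns the current UNIX timestamp as a float,
# so cast to int if you need whole seconds.
now = time.time()
now_int = int(now)

# To shift the timestamp to UTC+9, add 9 hours in seconds.
utc9 = now_int + 9 * 60 * 60

print(now_int, utc9)
```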
Here I introduce several ways to identify whether a word consists of the English alphabet or not. 1. Using the isalpha method In Python, the string object has a method called isalpha. However, this approach has a minor problem; for example, if you use the Korean alphabet, it still considers a Korean word alphabetic. (Of course, for non-Korean speakers, this wouldn’t be a problem 😅 ) To avoid this behavior, you should call the encode method before calling isalpha. 2….
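A quick sketch of both behaviors; the Korean example word is my own:

```python
# str.isalpha() is Unicode-aware, so Hangul counts as alphabetic.
print("한글".isalpha())             # True

# bytes.isalpha() only accepts ASCII letters, so encoding the string
# first rejects non-English words.
print("한글".encode().isalpha())    # False
print("hello".encode().isalpha())   # True
```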
First, create an input. Then, we will get its value using ‘id’, ‘class’, and ‘name’, respectively. Easy?
When you deploy your React.js application with Docker, you can serve it with Nginx. The basic principle is: first build the React.js application, then move the generated files to “/usr/share/nginx/html” in the container. You are just replacing the default Nginx welcome page with your React.js application. Now you just need to run Nginx. Please check the Dockerfile below. Check the sample project
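The principle above can be sketched as a two-stage Dockerfile. This is my own minimal version, not necessarily the post's exact file: the node image tag, the `npm run build` script, and the `build/` output directory (Create React App's default) are all assumptions:

```dockerfile
# Stage 1: build the React.js application (node version is an assumption)
FROM node:16 AS build
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build

# Stage 2: copy the generated static files into Nginx's web root,
# replacing the default welcome page, then run Nginx.
FROM nginx:alpine
COPY --from=build /app/build /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
```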
In this post, we will build a multi-node Hadoop cluster using three EC2 instances (one master, two slaves). (I will assume that you know how to use AWS. If you don’t, please check this link.) To run MapReduce tasks properly, you need enough memory, so we will use t2.medium instances. (If you are a student and need some free credit, check this link.) AWS EC2 t2.medium × 3 (1 for a name node, 2 for data nodes) Ubuntu…
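The one-name-node, two-data-node layout above boils down to a couple of config fragments on each instance. A sketch, with placeholder hostnames (the actual hostnames and port come from your own setup, not the post):

```xml
<!-- core-site.xml on every node: point HDFS at the name node.
     "namenode-host" is a placeholder for the master instance's hostname. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://namenode-host:9000</value>
  </property>
</configuration>
```

On the master, the workers file (called `slaves` in older Hadoop versions) would then list the two data-node hostnames, one per line.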