Sunday, March 17, 2019

Face anonymizer with Python


In this article, I’ll show how to anonymize human faces with Python, along with the code for it.
Although I’m not 100% sure, compared with the past, face detection seems to have reached a certain level of accuracy. If the picture is not too complex, faces are detected fairly reliably. With face detection, we can anonymize faces relatively easily.
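The anonymization step itself can be sketched without a full detection pipeline. In the sketch below, the bounding box that a face detector (for example, OpenCV's Haar cascades) would return is passed in directly, and the region is pixelated with plain NumPy; the function name and block size are my own choices for illustration, not the post's code.

```python
import numpy as np

def pixelate_region(image, box, block=8):
    """Pixelate a bounding box (x, y, w, h) in an H x W x C image.

    A face detector (e.g. OpenCV's Haar cascades) would supply the box;
    here the box is given directly so the sketch stays self-contained.
    """
    x, y, w, h = box
    region = image[y:y + h, x:x + w].copy()
    # Average each block x block tile, so the region loses fine detail.
    for by in range(0, h, block):
        for bx in range(0, w, block):
            tile = region[by:by + block, bx:bx + block]
            region[by:by + block, bx:bx + block] = tile.mean(axis=(0, 1), keepdims=True)
    out = image.copy()
    out[y:y + h, x:x + w] = region
    return out
```

Replacing the pixelation with a heavy blur works the same way; only the per-tile operation changes.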

Monday, January 14, 2019

Scala like error handling with Python


In this article, I’ll try Scala-like error and output handling with Python.
This is a personal memo.
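As a rough sketch of what "Scala-like" means here: Scala often wraps results in Either, with Right for success and Left for failure, and chains computations with map. A minimal Python imitation, with class names chosen to mirror Scala (not taken from the post), could look like this:

```python
class Right:
    """Success case, like Scala's Right."""
    def __init__(self, value):
        self.value = value

    def map(self, f):
        # Apply f to the wrapped value, staying in Right.
        return Right(f(self.value))

    def get_or_else(self, default):
        return self.value


class Left:
    """Failure case, like Scala's Left; map is a no-op."""
    def __init__(self, error):
        self.error = error

    def map(self, f):
        return self

    def get_or_else(self, default):
        return default


def safe_div(a, b):
    # Wrap division so the caller handles failure without try/except.
    if b == 0:
        return Left("division by zero")
    return Right(a / b)
```

`safe_div(10, 2).map(lambda v: v + 1).get_or_else(-1)` gives `6.0`, while `safe_div(1, 0)` skips the `map` and falls through to the default.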

Sunday, December 30, 2018

Sum function with a divide-and-conquer algorithm


Using a divide-and-conquer algorithm, I’ll write a sum function. This article is a personal memo about a book.
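As a sketch of the idea (the book's exact version may differ): split the list in half, sum each half recursively, and combine the two partial sums.

```python
def dc_sum(xs):
    # Base cases: empty list sums to 0, single element sums to itself.
    if not xs:
        return 0
    if len(xs) == 1:
        return xs[0]
    # Divide: split in half; conquer: sum each half; combine: add.
    mid = len(xs) // 2
    return dc_sum(xs[:mid]) + dc_sum(xs[mid:])
```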

Tuesday, December 25, 2018

Functional Kotlin: Book review

These days I've been on a journey of studying functional programming, and one of the books I read along the way was Functional Kotlin.

I'll leave my review of it here.

Monday, December 17, 2018

Awesome keyboard: my best buy

I think many people are on a journey to find the best keyboard for themselves. I was one of them. In this post, I’ll introduce my personal best keyboard purchase.

As a data scientist, I spend a lot of time in front of my PC. Although I'm not a work-environment geek, I more or less care about a few points. One of those is the keyboard.

Sunday, December 9, 2018

Kuzushiji-MNIST exploring


Kuzushiji-MNIST is an MNIST-like data set based on classical Japanese characters.
The following image shows part of the data set. As you can see, it is composed of visually complex characters.


In this article, I’ll give a simple introduction to Kuzushiji-MNIST and do classification with a Keras model.

Saturday, November 3, 2018

Data Science with Functional Programming on Python


In this article, I’ll show a functional programming approach to data science with Python. With a functional approach, some pre-processing can be concise. Especially in situations where you are reluctant to use the pandas library, this kind of approach can improve code readability.
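As a small taste of the approach, here is filtering and aggregating a list of records with `filter`, `map`, and `reduce` instead of a pandas DataFrame; the records below are made-up sample data, not from the post.

```python
from functools import reduce

# Made-up sample records standing in for a small data set.
records = [
    {"city": "tokyo", "price": 100},
    {"city": "osaka", "price": 80},
    {"city": "tokyo", "price": 120},
]

# Filter, transform, and aggregate with plain functions.
tokyo = filter(lambda r: r["city"] == "tokyo", records)
prices = map(lambda r: r["price"], tokyo)
total = reduce(lambda acc, p: acc + p, prices, 0)  # 100 + 120
```

Each step is a small, composable function, which is the readability gain the article is about.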

Tuesday, July 17, 2018

How to write Dense block of DenseNets: understanding and coding with Keras


This article covers the basic understanding and coding of the Dense block of DenseNets. DenseNets is one of the convolutional neural network models. If you have experience with fine-tuning or frequently tackle image recognition tasks, you have probably heard of it.
DenseNets is composed of Dense blocks. A Dense block is expressed in the image below, which is quoted from

In the history of convolutional neural networks, ResNet lets the network go deeper without the degradation problem, thanks to a shortcut path to the output of the Residual module. DenseNets and the Dense block are a related concept arrived at from a different approach.

This article is meant to help you understand the basic concept of the Dense block of DenseNets and how to write it. For coding, I’ll use Python and Keras.
About ResNet and the Residual module, please read the article below.
If you want to know the details of DenseNets and the Dense block, I recommend reading the article below.
If you find a mistake, please let me know by comment or mail.
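The core idea of the Dense block, that each layer receives the concatenation of all earlier feature maps and appends a fixed number of new channels (the growth rate), can be sketched without Keras. Below, a random linear map plus ReLU stands in for the real BN-ReLU-Conv composite, so this is only a shape-level illustration, not the post's Keras code:

```python
import numpy as np

def dense_block(x, num_layers=3, growth_rate=4, rng=None):
    """Toy Dense block on an (H, W, C) array.

    Each layer sees the concatenation of all previous feature maps and
    appends `growth_rate` new channels. A random linear map + ReLU
    stands in for the BN-ReLU-Conv composite of a real Dense block.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    features = x
    for _ in range(num_layers):
        c = features.shape[-1]
        w = rng.standard_normal((c, growth_rate))
        new = np.maximum(features @ w, 0.0)  # stand-in for conv + ReLU
        # Dense connectivity: keep everything, append the new channels.
        features = np.concatenate([features, new], axis=-1)
    return features
```

Starting from C channels, after L layers the block outputs C + L * growth_rate channels, which is exactly the linear channel growth the DenseNets paper describes.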

Monday, July 9, 2018

How to write Residual module: understanding and coding with Keras


This article covers the basic understanding and coding of the Residual module. If you have experience with fine-tuning or frequently tackle image recognition tasks, you have probably heard the network name ResNet. ResNet is composed of Residual modules, whose structure is expressed below.

The image above is from
Basically, a deeper neural network contributes to a better outcome. If you have enough computational resources (unfortunately, I don't), you can approach a difficult task with a really deep neural network. However, a deeper network brings the degradation problem, which makes the model difficult to train. The Residual module offers one solution to this problem: with it, we can build a deeper neural network by easing the difficulty of training.
For a precise and deeper understanding, I recommend reading the paper below. Here, I'll just show a summary for simple, concise understanding and coding with Keras.
If there are strange or wrong points, please let me know by comment or message.
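The shortcut idea itself fits in a few lines. In this toy sketch, two linear maps with a ReLU stand in for the convolutional layers of F(x), and the module output is F(x) + x; it only illustrates the identity shortcut, not the post's Keras implementation:

```python
import numpy as np

def residual_module(x, weights1, weights2):
    """Toy Residual module: y = relu(F(x) + x).

    Two linear maps with a ReLU stand in for the conv layers of F;
    weights2 must map back to x's dimension so the shortcut adds up.
    """
    h = np.maximum(x @ weights1, 0.0)   # first transform + ReLU
    fx = h @ weights2                   # second transform
    return np.maximum(fx + x, 0.0)      # identity shortcut, then ReLU
```

The key property: if F learns to output zeros, the module collapses to the identity, which is what makes very deep stacks trainable.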

Friday, June 29, 2018

An insight into AUC objective function from the viewpoint of evaluation


This article examines a model trained with an AUC objective function through several evaluation methods. In other words, as the title shows, it considers the AUC objective function from the viewpoint of evaluation.
In the article AUC as an objective function: with Julia and Optim.jl package, I made a model with an AUC objective function. The predicted scores were distributed in a really narrow range, because the AUC objective is based only on the order of predictions, without caring about the distance from the explained variable. Under some evaluation metrics, the model's scores do not look good.
To confirm this point, I'll leave a simple experiment here.
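The point can be illustrated with a toy version of the experiment: two score vectors with the same ordering get the same AUC, yet the squeezed one looks much worse under log loss. The numbers below are made up for illustration, not the article's data.

```python
import math

def auc(y_true, scores):
    # AUC = fraction of (positive, negative) pairs ranked correctly.
    pos = [s for y, s in zip(y_true, scores) if y == 1]
    neg = [s for y, s in zip(y_true, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def log_loss(y_true, scores):
    # Mean negative log likelihood; punishes poorly calibrated scores.
    return -sum(y * math.log(s) + (1 - y) * math.log(1 - s)
                for y, s in zip(y_true, scores)) / len(y_true)

y = [0, 0, 1, 1]
wide = [0.1, 0.4, 0.6, 0.9]        # well-spread probabilities
narrow = [0.50, 0.51, 0.52, 0.53]  # same ordering, squeezed together
```

Both vectors order every positive above every negative, so both reach AUC = 1.0, but the narrow scores incur a clearly higher log loss, which is exactly the behavior the article discusses.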

Julia: version 0.6.3

Thursday, June 28, 2018

AUC as an objective function: with Julia and Optim.jl package


In this article, I'll do AUC optimization on logistic regression. With Julia's Optim package, we can optimize an AUC objective function relatively easily.

Julia: Version 0.6.3
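For readers who prefer Python, the same idea can be re-sketched with NumPy: the pairwise 0/1 indicator inside AUC is replaced by a sigmoid of the score difference, so that plain gradient ascent applies. This is my re-sketch of the approach, not the post's Julia/Optim.jl code:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_auc_linear(X, y, lr=1.0, steps=300):
    """Maximize a smooth AUC surrogate for a linear score w.x.

    True AUC counts (positive, negative) pairs ranked correctly; here
    the 0/1 pair indicator becomes sigmoid(score_pos - score_neg),
    which is differentiable in w.
    """
    w = np.zeros(X.shape[1])
    pos, neg = X[y == 1], X[y == 0]
    for _ in range(steps):
        diffs = (pos @ w)[:, None] - (neg @ w)[None, :]   # (P, N) score gaps
        g = sigmoid(diffs) * (1.0 - sigmoid(diffs))       # surrogate derivative
        pair_dx = pos[:, None, :] - neg[None, :, :]       # (P, N, D) feature gaps
        grad = (g[:, :, None] * pair_dx).mean(axis=(0, 1))
        w += lr * grad                                    # gradient ascent
    return w
```

Because only score differences matter, the fitted scores can sit in a very narrow range, which is the behavior examined in the follow-up article.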

Sunday, June 24, 2018

Follow simple analysis workflow with Julia


In this article, I'll use Julia to roughly reproduce the simple analysis I did in Simple analysis workflow to data about default of credit card clients.
After I wrote that article, I planned some follow-up ones. But first, I want to follow the same flow with Julia. So I will.

Thursday, June 21, 2018

Simple analysis workflow to data about default of credit card clients


Recently I had the opportunity to read some papers about financial data analysis, covering credit scores, default rates, and so on. Personally, I want to get to cutting-edge methods as soon as possible. But in this kind of case, it is important to start from the basic flow. So, in this article, I'll follow a basic workflow such as univariate analysis with logistic regression.
To focus on the basic flow and some characteristics, I'll ignore some conventions around the data and modeling.
This article more or less follows chapters 2 and 3 of the following article.
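A univariate step of that workflow can be sketched as fitting P(y=1) = sigmoid(a*x + b) for a single explanatory variable by gradient descent on the log loss. The data below is made up for illustration; the post itself uses the credit card default data set.

```python
import numpy as np

def fit_univariate_logistic(x, y, lr=0.1, steps=2000):
    """Univariate logistic regression: P(y=1) = sigmoid(a*x + b).

    Plain gradient descent on the log loss; a library (statsmodels,
    scikit-learn) would normally do this, the loop just shows the idea.
    """
    a, b = 0.0, 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(a * x + b)))
        a -= lr * np.mean((p - y) * x)   # gradient of log loss w.r.t. a
        b -= lr * np.mean(p - y)         # gradient of log loss w.r.t. b
    return a, b
```

Repeating this fit per variable and comparing the coefficients and fits is the univariate screening step of the basic workflow.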