Table of contents

  1. PyTorch find minimum of a custom function with an optimiser (Adam)

    March 2022

  2. Binary Cross-Entropy vs Mean Squared Error

    March 2022

  3. Binary heap tree proof: index of left child is doubled index of parent + 1

    January 2022

  4. Regex-based tokenizer

    January 2022

  5. emscripten + node.js

    December 2021

  6. Android: dim/darkness of a Dialog

    April 2021

  7. Testing MutableStateFlow without flakiness

    April 2021

  8. Mockito + kotlin Object + BitRise = fail

    April 2021

  9. Python: regex-based tokenizer in 4 lines of code

    April 2021

  10. Full-height BottomSheetDialogFragment

    March 2021

  11. My experience of targeted ads in VK

    March 2021

  12. Picasso, Glide, or Fresco

    March 2021

  13. Painless transparent status and navigation bar

    March 2021

  14. Android view crop: clipToPadding and clipChildren

    March 2021

  15. NVidia GT1030: how to run TensorFlow on Ubuntu 16.04

    March 2018

  16. FreeMind TODO intro!

    April 2017

  17. Normal Mapping Visualisation video

    March 2017

  18. ffmpeg: gapless video splitting and concatenation

    December 2016

  19. AndroidStudio bug-report. Can’t use debugger with NDK library

    March 2016

  20. BUG: Android Kotlin dex transformation error

    February 2016

Content

PyTorch find minimum of a custom function with an optimiser (Adam)

I caught myself thinking that most of the tutorials on PyTorch are about neural networks, even though it's quite a general optimisation framework. There's a tutorial on how to use autograd, but using raw autograd is not the same as using an already written, high-quality optimiser like Adam, Adagrad, etc.

So I decided to start with a minimal example and find the minimum of x^2 + 1. Weirdly, I haven't found many tutorials and got stuck on that simple problem. Conor Mc wrote an article, but it uses a custom class based on nn.Module. There was also an article by Bijay Kumar, but it used an nn.Linear layer! 🙂 So, yeah, it took me some time to figure out a working solution, and here it is:

from matplotlib.pyplot import *
from torch.optim import Adam
from torch import Tensor
from torch.nn import Parameter

X = Parameter(Tensor([10]))  # the value we optimise, starting at 10

opt = Adam([X], lr=1)
losses = []
for i_step in range(10):
    y = X ** 2 + 1           # the function we minimise
    opt.zero_grad()
    y.backward()             # compute dy/dX
    opt.step()               # let Adam update X
    losses.append(y.item())

plot(losses)
show()

Binary Cross-Entropy vs Mean Squared Error

In this post I'm trying to better understand Cross-Entropy loss and why it is better than Mean Squared Error.

On the plot below you can see that Mean Squared Error may produce simply inadequate and, sometimes, unoptimisable values on a small amount of noisy data.
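
As a minimal sketch of one commonly cited reason behind this (my addition, not from the original experiment): with a sigmoid output that is confidently wrong, the MSE gradient almost vanishes, while the BCE gradient stays large. The numbers in the comments are approximate.

import torch
import torch.nn.functional as F

# Target is 1, but the raw model output (logit) is confidently wrong.
logit = torch.tensor([-8.0], requires_grad=True)
target = torch.tensor([1.0])

# MSE on the sigmoid output: the sigmoid saturates, so the gradient almost vanishes.
mse = F.mse_loss(torch.sigmoid(logit), target)
mse.backward()
print('MSE grad:', logit.grad)   # ~ -6.7e-4, almost no learning signal

# BCE on the same logit: the gradient is roughly (p - target), still a useful signal.
logit.grad = None
bce = F.binary_cross_entropy_with_logits(logit, target)
bce.backward()
print('BCE grad:', logit.grad)   # ~ -1.0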

TODO: noise-free data, large amounts of data, non-linearly separable data.

Regex-based tokenizer

This post is not mine, but I found it so useful that, when I lost the URL, I finally decided to save the content to my blog. There's also a guide in the official Python documentation, but it looks a bit more complicated to me.

import re

SCANNER = re.compile(r'''
  (\s+) |                      # whitespace
  (//)[^\n]* |                 # comments
  0[xX]([0-9A-Fa-f]+) |        # hexadecimal integer literals
  (\d+) |                      # integer literals
  (<<|>>) |                    # multi-char punctuation
  ([][(){}<>=,;:*+-/]) |       # punctuation
  ([A-Za-z_][A-Za-z0-9_]*) |   # identifiers
  """(.*?)""" |                # multi-line string literal
  "((?:[^"\n\\]|\\.)*)" |      # regular string literal
  (.)                          # an error!
''', re.DOTALL | re.VERBOSE)

Combine this with a re.finditer() call on your source string, like this:

for match in re.finditer(SCANNER, data):
   space, comment, hexint, integer, mpunct, \
   punct, word, mstringlit, stringlit, badchar = match.groups()
   if space: ...
   if comment: ...
   # ... 
   if badchar: raise FooException...
https://deplinenoise.wordpress.com/2012/01/04/python-tip-regex-based-tokenizer/
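
For illustration, here is a small usage sketch of my own (not from the original post), reusing the SCANNER defined above on a made-up input. Note that some groups capture only part of the token: the hexadecimal group captures just the digits, and the comment group captures just the '//'.

data = 'x = 0xFF + 42  // set x'

# Group names in the same order as the alternatives inside SCANNER.
GROUP_NAMES = ('space', 'comment', 'hexint', 'integer', 'mpunct',
               'punct', 'word', 'mstringlit', 'stringlit', 'badchar')

for match in SCANNER.finditer(data):
    for name, value in zip(GROUP_NAMES, match.groups()):
        if value is not None and name != 'space':
            print(name, repr(value))

# Prints: word 'x', punct '=', hexint 'FF', punct '+', integer '42', comment '//'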

emscripten + node.js

For some reason I couldn't find this info easily. For debugging purposes it would be nice to be able to connect C++ code with node.js. The official emscripten documentation explains how to run code in a browser, but node.js works differently. Let's create a simple C++ file:

main.cpp

#include "main.h"

float lerp(float a, float b, float t) {
    return (1 - t) * a + t * b;
}

main.h:

float lerp(float a, float b, float t);

It will not have any emscripten-related code, so I can use my IDE normally, without it knowing anything about emscripten at all 🙂 Okie, let's go:

emscripten.cpp


#include <emscripten/bind.h>
#include "main.h"

using namespace emscripten;

EMSCRIPTEN_BINDINGS(my_module) {
    function("lerp", &lerp);
}

Now we need to compile the code into JS:

emcc -s WASM=0 --bind -o main.js -s EXPORT_ES6 -s ENVIRONMENT=shell emscripten.cpp main.cpp

package.json

{
  "name": "untitled",
  "version": "1.0.0",
  "main": "index.js",
  "license": "MIT",
  "type": "module"
}

index.js

import wasmModule from './main.js';

const instance = await wasmModule();

console.log(instance.lerp(0, 1, 0.5));

Also, a note about some flags:

-s DISABLE_EXCEPTION_CATCHING=0 (re-enables C++ exception catching, which emscripten disables by default)

-s ALLOW_MEMORY_GROWTH=1 (lets the heap grow at run time instead of failing with a fixed-size memory)

-O1 is the maximum optimisation level that worked for me: -O2, -O3 and -Os result in a run-time failure when importing the module.

Android: dim/darkness of a Dialog

class SomeFragment : BottomSheetDialogFragment() {
    override fun getTheme() = R.style.SomeThemeForBottomSheetDialog
}

<style name="SomeThemeForBottomSheetDialog" parent="@style/ThemeOverlay.MaterialComponents.BottomSheetDialog">
    <item name="android:backgroundDimAmount">0.35</item>
    <item name="bottomSheetStyle">@style/SomeBottomSheet</item>
</style>

<style name="SomeBottomSheet" parent="@style/Widget.Design.BottomSheet.Modal">
    <item name="android:background">@android:color/transparent</item>
</style>

Testing MutableStateFlow without flakiness

  1. We need the observable itself:
private val _countdown = MutableStateFlow(0)
val countdown = _countdown.asStateFlow()

2. Now we need a collector that we'll use for testing:

val scope = CoroutineScope(Job() + Dispatchers.Main)
val countdownObserver = mock<FlowCollector<Int>>()

scope.launch { countdown.collect(countdownObserver) }

3. Let’s test:

_countdown.value = 0
verify(countdownObserver).emit(0)

_countdown.value = 0
verify(countdownObserver).emit(0)

The complete snippet.

  1. The second verification fails, and that's the correct behaviour: a StateFlow doesn't re-emit a value equal to the current one.
  2. Dispatchers.Main is a must! Dispatchers.IO will give you a lot of flakiness!

package com.visa.mobile.feature.payments.usecase

import com.nhaarman.mockitokotlin2.clearInvocations
import com.nhaarman.mockitokotlin2.mock
import com.nhaarman.mockitokotlin2.verify
import com.visa.mobile.common.TestCoroutineRule
import kotlinx.coroutines.*
import kotlinx.coroutines.flow.FlowCollector
import kotlinx.coroutines.flow.MutableStateFlow
import org.junit.Rule
import org.junit.Test

@InternalCoroutinesApi
@ExperimentalCoroutinesApi
class MutableStateFlowTest {
    @get:Rule
    var coroutineRule = TestCoroutineRule()

    @Test
    fun `test MutableStateFlow`() = coroutineRule.runBlockingTest {
        val countdown = MutableStateFlow(0)
        val countdownObserver = mock<FlowCollector<Int>>()

        val scope = CoroutineScope(Job() + Dispatchers.Main)

        scope.launch { countdown.collect(countdownObserver) }

        countdown.value = 0; verify(countdownObserver).emit(0)
        clearInvocations(countdownObserver)
        countdown.value = 0; verify(countdownObserver).emit(0)

        scope.cancel()
    }
}

Mockito + kotlin Object + BitRise = fail

In one of my recent PRs, there was this small change, apart from ~1000 other lines of code:

object MockData {
  ...
  val NOW = mock<Now> {
      on { ms } doReturn msTIMEOUT
      on { s } doReturn secTIMEOUT.toLong()
      on { date } doReturn Date(msTIMEOUT)
  }
  ...
}

It gave me lots of errors like these on BitRise, though locally everything worked perfectly:

...
com.mycompany.packagename.ClassName100Test > Test case 100 FAILED
java.lang.NoClassDefFoundError at com.mycompany.packagename.ClassName100Test.kt:57

com.mycompany.packagename.ClassName101Test > Test case 101 FAILED
org.mockito.exceptions.misusing.UnfinishedStubbingException at ClassName101Test.kt:35

com.mycompany.packagename.ClassName102Test > Test case 102 FAILED
java.lang.NoClassDefFoundError at ClassName102Test.kt:43

com.mycompany.packagename.ClassName103Test > Test case 103 FAILED
java.lang.NoClassDefFoundError at ClassName103Test.kt:47

com.mycompany.packagename.ClassName104Test > Test case 104 FAILED
java.lang.NoClassDefFoundError at ClassName104Test.java:-2

com.mycompany.packagename.ClassName105Test > Test case 105 FAILED
java.lang.NoClassDefFoundError at ClassName105Test.java:-2

com.mycompany.packagename.ClassName106Test > Test case 106 FAILED
java.lang.NoClassDefFoundError at ClassName106Test.java:-2

com.mycompany.packagename.ClassName107Test > Test case 107 FAILED
java.lang.NoClassDefFoundError at ClassName107Test.java:-2
...

I can't draw any other conclusion: avoid using Mockito mocks inside Kotlin objects.

Python: regex-based tokenizer in 4 lines of code

import re

SCANNER = re.compile(r'''
  (\s+) |                      # whitespace
  (//)[^\n]* |                 # comments
  0[xX]([0-9A-Fa-f]+) |        # hexadecimal integer literals
  (\d+) |                      # integer literals
  (<<|>>) |                    # multi-char punctuation
  ([][(){}<>=,;:*+-/]) |       # punctuation
  ([A-Za-z_][A-Za-z0-9_]*) |   # identifiers
  """(.*?)""" |                # multi-line string literal
  "((?:[^"\n\\]|\\.)*)" |      # regular string literal
  (.)                          # an error!
''', re.DOTALL | re.VERBOSE)

for match in re.finditer(SCANNER, data):
   space, comment, hexint, integer, mpunct, \
   punct, word, mstringlit, stringlit, badchar = match.groups()
   if space: ...
   if comment: ...
   # ... 
   if badchar: raise FooException...

Source

Full-height BottomSheetDialogFragment

val behavior = (requireDialog() as BottomSheetDialog).behavior
behavior.state = BottomSheetBehavior.STATE_EXPANDED
// behavior.peekHeight = 0

behavior.peekHeight = 0 causes a glitch: when I swipe the dialog down, it hides away, but the background stays dimmed.

My experience of targeted ads in VK

1. Small ads (on a side-bar)

No one sees small ads in the sidebar. Yes, it costs $0.015 per 1000 views, but the ad was shown 207,605 times and only 2 people clicked on it. So the effective cost per click is ~$1.85 for such ads. I'd probably only try to use them to spread some info:

[screenshot: VK Ads statistics, 2021-03-24 18:02]

2. Promoted posts

I had WAY bigger success in comparison with small ads. They attracted people far more effectively. More people joined overall, and the cost per click was only 66% of the CPC of a small ad. Not bad! Big ads / promoted posts work WAY better. This time I also decided to follow a slightly different approach: instead of just saying 'Hey, the community is here', I decided to describe the benefits for a potential member.

I created two variations for two different auditories:

a) shown to 'filtered' people: filtered by gender/marital status (focus on singles) and whose interests were determined by the social platform as "IT":

[screenshot: VK Ads statistics for the broad 'IT interests' audience, 2021-03-24 17:58]

b) I just searched for all the 'programming'/'python' groups and decided to target only users of those groups. It's only 1000 users, by the way. Though! People tend to join the community with 10 times fewer ad impressions! The CTR is 10 times higher! The cost per click is almost 4 times cheaper!

But it’s not the most important difference between them!

People from the first (broader) group were completely dissatisfied/disappointed:

[screenshot: VK Ads statistics, 2021-03-24 18:08]

People from the first group tried to 'join' the group just to be able to get rid of the ad, and then they also hid all the group's posts.

People from the second group were totally different:

[screenshot: VK Ads statistics, 2021-03-24 18:42]

No one hid any post, nobody reported it. People just join! Seems like the perfect ad!

First impressions/emotions/thoughts:

  1. People want an active community. So it's hard to start, really. People expect that there are ALREADY other people to talk to or some services offered. So it's not enough to buy an ad. There should probably be some kind of waiting list, so people wouldn't see that the group is empty until it grows to some size.
  2. To find an audience, I believe, I need to go where the audience already is: communities, concerts/shows, local events, etc. It's not effective to broadcast/shout: people avoid ads at all costs.
  3. Audiences are small. I need to keep searching for more and more people. It should probably become a lifestyle: constantly looking for more people.
  4. I can easily hit the limit: there are only 26k people who are even remotely about IT in the whole town. These 26k are not only devs; they may be accountants who needed some IT support in the past, etc. So there are just no programmers in my town: they all leave for Moscow / big cities, and the rest, those who didn't leave, just can't build a community. It's not obvious for a beginner, but it's quite obvious once you think about it: sometimes the goal of building a local community / starting a business is simply unreachable.
  5. Looking for people basically doesn't take too much time. I'm not sure that I need a dedicated marketer/partner for that.
  6. It's also good to do it on my own, since I know my audience better, and it lets me focus better on the product / my clients' needs. I also learn about failed strategies, so I know not only who needs my stuff, but also who doesn't.
  7. Without content, a group is empty. But I hate content, to be honest; it feels like a surrogate for communication. But it is what it is: it seems people won't stay without it. At least that's what I can extrapolate from 9 cases.

Picasso, Glide, or Fresco

I made a test loading ~100 MB of images (32 individual bitmaps) from the /res/drawable-xxhdpi folder. I didn't use imageView.setImageResource because it loads images synchronously and blocks the UI thread. Overall, it takes ~50 minutes on a Galaxy S3 to load everything. So, the results:

  1. Fresco also loads images synchronously. It probably uses imageView.setImageResource under the hood.
  2. Glide is better, but its prefetching algorithm is somewhat different from what I expect. It feels like I'm always scrolling an empty RecyclerView instead of seeing images.
  3. Picasso worked the best:
    1. Async image loading.
    2. Less code.
    3. Like Glide, it needs no non-standard UI components (in comparison with Fresco).
    4. Nice loading prioritisation. In the splash activity I start prefetching; then, after ~5 seconds, in my RecyclerView activity I load images with a higher priority. Thus, a user can scroll the list smoothly while the prefetch keeps loading in the background.

So, finally, I adjusted the cache size to ~80% of free memory, which allowed caching everything. Everything! So it loads super smoothly, works super smoothly, and takes almost no code.

NVidia GT1030: how to run TensorFlow on Ubuntu 16.04

I used these guides, though it was quite a long dance, so I didn't write everything down 🙁
1. https://www.nvidia.com/en-us/data-center/gpu-accelerated-applications/tensorflow/
2. http://www.python36.com/install-tensorflow141-gpu/
3. https://medium.com/@samnco/using-the-nvidia-gt-1030-for-cuda-workloads-on-ubuntu-16-04-4eee72d56791

Key points:
1. You need to install the nvidia-375 graphics driver.
2. Don't reboot.
3. Then you need to install CUDA (9.1). It also installs nvidia-384, though it doesn't cause a login loop after that.
4. Reboot.
5. Compile TensorFlow from the source code.

The driver and CUDA are deb-based.

Good sign: your video adapter should work. Correct installation of the drivers alone is 90% of success. DO NOT download the drivers or CUDA from the official site, only from the official repos. See the guides above for the details.
If you have any questions, please mail me: egslava@gmail.com

FreeMind TODO intro!

Yahoo! 🙂 FreeMind TODO finally goes live! I spent two days understanding the basics of Blender video editing and recording this video. Finally, I've produced the first part of an intro to FreeMind TODO.

I hope somebody finds FreeMind TODO useful. Any feedback is appreciated. If it’s useful, I’m planning to record another one with the actual application description.

And, yes, it doesn't aim to be Trello/Coggle/etc. It's just a quick and easy TODO generator for HUGE and COMPLEX mindmaps.

 

The GitHub link is also available: https://github.com/egslava/freemind-todo

Normal Mapping Visualisation video

Yesterday I published my first video 🙂 It's about the processes that happen in a normal mapping shader.

All sources for that video are available here:

https://github.com/egslava/normal-mapping-demonstration

Stars and likes are very welcome 🙂 It's my very first video editing experience, and I'm glad I managed to get it done.

ffmpeg: gapless video splitting and concatenation

For instance, you have a big video file all.mp4 and want to split it into several parts. Splitting it into 1-second parts:

ffmpeg -i all.mp4 -ss 0 -t 1 first-1-sec.mp4
ffmpeg -i all.mp4 -ss 1 -t 1 first-2-sec.mp4
ffmpeg -i all.mp4 -ss 2 -t 1 first-3-sec.mp4
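
If there are many parts, the same calls can be generated in a loop. Here is a rough sketch of mine (not from the original post), assuming ffmpeg is on the PATH; the total duration of 3 seconds is a made-up placeholder:

import subprocess

TOTAL_SECONDS = 3  # hypothetical length of all.mp4; adjust to your video

for start in range(TOTAL_SECONDS):
    subprocess.run([
        'ffmpeg', '-i', 'all.mp4',
        '-ss', str(start),   # start offset in seconds
        '-t', '1',           # take one second
        f'first-{start + 1}-sec.mp4',
    ], check=True)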

After this, you may want to check the result by combining these files back into one.

One alternative is:

ffmpeg -i "concat:first-1-sec.mp4|first-2-sec.mp4|first-3-sec.mp4" -c copy blah.mp4

But it doesn't always work for me. So there's also another way:

  1. Creating files.txt:

    file first-1-sec.mp4
    file first-2-sec.mp4
    file first-3-sec.mp4
  2. Running
    ffmpeg -f concat -i files.txt  blah.mp4

And here we are!

AndroidStudio bug-report. Can’t use debugger with NDK library

Yes, it is. If you're researching how to use the native debugger with Android Studio, there aren't many solutions to this problem. You shouldn't use native libraries: just integrate your C++ sources into the main project, so the debugger works.

I've created a bug report, so, I hope, Google will close it. But there's still no info.

BUG: Android Kotlin dex transformation error

Today I got a strange error on a HelloWorld-like project:

Error:Execution failed for task ‘:app:transformClassesWithDexForDebug’.
> com.android.build.api.transform.TransformException: java.lang.RuntimeException: com.android.ide.common.process.ProcessException: java.util.concurrent.ExecutionException: com.android.ide.common.process.ProcessException: org.gradle.process.internal.ExecException: Process ‘command ‘/Library/Java/JavaVirtualMachines/jdk1.8.0_25.jdk/Contents/Home/bin/java” finished with non-zero exit value 1
