
Ryzen 5 1600 first thoughts

I spent a day installing and setting up my new PC. I went with a Ryzen 5 1600, 16 GB of RAM, and an ASRock AB350 Pro4 motherboard. I also installed an 850 SSD to speed up I/O. First impressions: really fast. The Linux side (Ubuntu 16.04) was mature enough, but Windows did not work out of the box. The only real downside was that I had to update the motherboard BIOS to version 2.5 to get the DDR4 running at 2400 MHz. The second issue was that I had to use the AMD Pro drivers for my RX 460 to use the full potential of OpenCL.
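
A quick way to confirm that the RX 460 is actually exposed through OpenCL after switching to the AMD Pro drivers is to list the available platforms and devices. Here is a minimal sketch, assuming the pyopencl Python package is installed (it was not part of the original setup, it is only for illustration):

# List OpenCL platforms and devices to check that the RX 460 shows up
# under the AMD Pro driver stack. Requires: pip install pyopencl
import pyopencl as cl

for platform in cl.get_platforms():
    print("Platform:", platform.name)
    for device in platform.get_devices():
        print("  Device:       ", device.name)
        print("  Compute units:", device.max_compute_units)
        print("  OpenCL:       ", device.version)

If the card only shows up under the open-source Mesa/Clover platform, or not at all, that usually means the proprietary OpenCL runtime is not being picked up.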



I also played a bit with XMR (Monero) mining. Despite what people say about poor mining performance, I got 400-430 H/s with 7 threads and huge pages enabled.
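
For the huge pages part, the kernel has to reserve them before the miner starts (typically via the vm.nr_hugepages sysctl). Below is a minimal sketch that only checks what is currently reserved by reading /proc/meminfo on Linux; the number of pages a given miner needs is miner-specific and not covered here:

# Print the kernel's current huge page reservation from /proc/meminfo.
# Huge pages are reserved with e.g. `sysctl -w vm.nr_hugepages=<N>`;
# the value of N depends on the miner and is not specified here.
def hugepages_status():
    with open("/proc/meminfo") as meminfo:
        for line in meminfo:
            if line.startswith(("HugePages_Total", "HugePages_Free", "Hugepagesize")):
                print(line.strip())

if __name__ == "__main__":
    hugepages_status()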




