Tuesday, February 11, 2014

jCrop Minimal and Maximal Aspect Ratio with Preview Pane

[Jcrop Example]

An example using minimal and maximal aspect ratios and a preview pane.

Repository: https://github.com/aguidrevitch/Jcrop/

Updated jquery.Jcrop.js: https://raw.github.com/aguidrevitch/Jcrop/master/js/jquery.Jcrop.js

jCrop itself: http://deepliquid.com/content/Jcrop.html

Monday, October 28, 2013

Are you happy with your current issue management SaaS?

Because I finally am, though it is not a SaaS. At my current workplace, DarwinApps, we've spent two years narrowing it down. First we tried Pivotal Tracker, then some other services I can't even remember, then Asana for more than a year. We have multiple projects running at once, and the most obvious flaw of all these services is the lack of meaningful task numbering. In a voice meeting we have to navigate to the tracker, search, and paste the full URL of an issue, which is really painful. Can you remember any of the following URLs?

  • https://www.pivotaltracker.com/s/projects/391021/stories/19675435
  • https://app.asana.com/0/168983775190/2827676382156
Then we reached Jira, and what a relief! The links are finally usable and easy to remember and construct:
  • https://darwinapps.atlassian.net/browse/CLINK-6
And Jira is definitely the best SaaS of the bunch, at least for our needs, as it has meaningful task numbering and multiple dimensions to assign tickets to. However, in the end we stuck with Redmine, but that's really another story, one I can only tell if you are ready to maintain your own virtual or dedicated server. What I really wonder is which SaaS you are using for issue tracking, whether you are happy with it, and if not, why. Please leave a note in the comments, thank you!


Thursday, October 3, 2013

Local SSH tunneling for Fabric fabfile

The latest Fabric (1.8 as of now) provides the remote_tunnel context manager, which creates a tunnel forwarding a locally-visible port to the remote target. But in my case I needed the reverse scheme: forwarding a local port to the remote MySQL server so a local Perl script could reach it. The code below is based on paramiko's forward.py example, with minor changes to keep the tunnel working in the background using threading:

import SocketServer, paramiko, threading, select
from fabric.state import *

class ForwardServer (SocketServer.ThreadingTCPServer):
    daemon_threads = True
    allow_reuse_address = True

class Handler (SocketServer.BaseRequestHandler):
    def handle(self):
        try:
            chan = self.ssh_transport.open_channel('direct-tcpip',
                                                   (self.chain_host, self.chain_port),
                                                   self.request.getpeername())
        except Exception, e:
            print('Incoming request to %s:%d failed: %s' % (self.chain_host,
                                                            self.chain_port,
                                                            repr(e)))
            return
        if chan is None:
            print('Incoming request to %s:%d was rejected by the SSH server.' %
                    (self.chain_host, self.chain_port))
            return
        print('Connected!  Tunnel open %r -> %r -> %r' % (self.request.getpeername(),
                                                            chan.getpeername(), (self.chain_host, self.chain_port)))
        # shuttle data in both directions until one side closes
        while True:
            r, w, x = select.select([self.request, chan], [], [])
            if self.request in r:
                data = self.request.recv(1024)
                if len(data) == 0:
                    break
                chan.send(data)
            if chan in r:
                data = chan.recv(1024)
                if len(data) == 0:
                    break
                self.request.send(data)
        peername = self.request.getpeername()
        chan.close()
        self.request.close()
        print('Tunnel closed from %r' % (peername,))

def forward_tunnel(local_port, remote_host, remote_port, transport):
    # this is a little convoluted, but lets me configure things for the Handler
    # object.  (SocketServer doesn't give Handlers any way to access the outer
    # server normally.)
    class SubHandler (Handler):
        chain_host = remote_host
        chain_port = remote_port
        ssh_transport = transport
    # serve in a daemon thread so the fabfile task can carry on
    server_thread = threading.Thread(target=ForwardServer(('', local_port), SubHandler).serve_forever)
    server_thread.daemon = True
    server_thread.start()

def task():
    forward_tunnel(3307, 'localhost', 3306, connections[env.host].get_transport())
    # now remote mysql is available at localhost:3307

Monday, September 23, 2013

Redmine, procmail + rdm-mailhandler.rb howto

This is a quick howto on allowing creation of new tickets in Redmine by email, using procmail and rdm-mailhandler.rb (shipped with Redmine). Redmine's wiki has a good page about processing incoming emails, http://www.redmine.org/projects/redmine/wiki/RedmineReceivingEmails, but it lacks the actual procmail rules.

1. Configure Postfix to deliver local mail to procmail:

root@host# tail /etc/postfix/main.cf
virtual_alias_maps = hash:/etc/postfix/virtual
mailbox_command = /usr/bin/procmail

2. Configure aliases:

root@host# cat /etc/postfix/virtual
@redmine.yourhost.com redmine

3. Rebuild the virtual alias map:

root@host# postmap /etc/postfix/virtual

4. In redmine's home dir, put the following .procmailrc:

redmine@host:~$ cat ~/.procmailrc

:0
* ^TO_[^@]+@redmine\.yourhost\.com
* ^TO_\/[^@]+
| /var/www/redmine.yourhost.com/rdm-mailhandler.rb --url https://redmine.yourhost.com/ --key XXXXXXXXXXXXXXXXXXXXX --project $MATCH --tracker bug --allow-override tracker,priority --unknown-user create --no-permission-check --no-account-notice

Now, to create a new issue in Redmine, you can send an email to identifier@redmine.yourhost.com, where `identifier` is the Redmine project identifier; just do not forget to replace yourhost.com with your actual hostname.

Wednesday, November 21, 2012

RadioTray cp-1251 support in Ubuntu 12.04

  1. sudo gedit /usr/share/pyshared/radiotray/AudioPlayerGStreamer.py
  2. find self.eventManager.notify(EventManager.SONG_CHANGED, metadata)
  3. before it, put:

  metadata['title'] = metadata['title'].encode('latin-1').decode('cp1251').encode('utf8')
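The one-liner above works because many Russian streams send ICY metadata as cp1251 bytes, which ends up decoded as latin-1 somewhere along the way, producing mojibake; re-encoding to latin-1 recovers the original bytes, which can then be decoded as cp1251. A quick sketch of that round trip (in Python 3 here for illustration; the patched file itself is Python 2):

```python
# A track title as a Russian stream actually sends it: cp1251 bytes.
raw = 'Кино - Группа крови'.encode('cp1251')

# What arrives in the metadata: the same bytes wrongly decoded as latin-1.
# This yields mojibake, because latin-1 maps every byte to *some* character.
garbled = raw.decode('latin-1')

# The fix from step 3: undo the wrong decode, redo the right one.
fixed = garbled.encode('latin-1').decode('cp1251')
print(fixed)  # Кино - Группа крови
```

This only helps for cp1251 streams; titles that were already valid UTF-8 would be mangled by the same transformation.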

Saturday, November 17, 2012

Installing ubuntu 12.04 on SSD OCZ Agility 3

To allow fdisk to properly align partitions, use

    fdisk -cu /dev/sda

and use start sector 2048 (it will be the default start sector).
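Why sector 2048? With 512-byte sectors that start offset is exactly 1 MiB, which is a multiple of the 4 KiB NAND page size and of typical SSD erase-block sizes, so partitions starting there never straddle an erase-block boundary. A quick sanity check (the 512 KiB erase-block figure is an assumption; check your drive's datasheet):

```python
SECTOR_SIZE = 512            # bytes; the unit fdisk counts in with -u
start_sector = 2048

offset = start_sector * SECTOR_SIZE
print(offset)                # 1048576, i.e. 1 MiB into the disk

# 1 MiB is evenly divisible by a 4 KiB page and by an assumed
# 512 KiB erase block, so the partition start is aligned to both.
assert offset % 4096 == 0
assert offset % (512 * 1024) == 0
```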

Monday, November 12, 2012

jQuery-File-Upload Express.js Middleware

I really like the jQuery-File-Upload plugin, but it lacked integration with Express.js, so I ended up adapting its Node.js code into a middleware for Express.js. Example Express.js integration:

    var express = require("express"),
        upload = require('jquery-file-upload-middleware');

    var app = express();

    app.configure(function () {
        app.use('/upload', upload({
            uploadDir: __dirname + '/public/uploads',
            uploadUrl: '/uploads/'
        }));
    });

This way the upload middleware is tied to the /upload path, and on the frontend you use /upload as the URL to upload files:

   <input id="fileupload" type="file" name="files[]" data-url="/upload" multiple>
   <script>$('#fileupload').fileupload({ dataType: 'json' })</script>
Other options and their default values:

    tmpDir: '/tmp',
    maxPostSize: 11000000000, // 11 GB
    minFileSize: 1,
    maxFileSize: 10000000000, // 10 GB
    acceptFileTypes: /.+/i,
    // Files not matched by this regular expression force a download dialog,
    // to prevent executing any scripts in the context of the service domain:
    safeFileTypes: /\.(gif|jpe?g|png)$/i,
    imageTypes: /\.(gif|jpe?g|png)$/i,
    imageVersions: {
        thumbnail: {
            width: 80,
            height: 80
        }
    },
    accessControl: {
        allowOrigin: '*',
        allowMethods: 'OPTIONS, HEAD, GET, POST, PUT, DELETE'
    }

IMPORTANT:  jquery-file-upload-middleware should be registered before express.bodyParser(), or else upload progress events will not fire.

Get the code