Wednesday, July 31, 2024

VisualVM on Windows with SDKMAN

7/31/2024 - I have been using SDKMAN on Windows via Git Bash for some time now and truly like it.  I did, however, come across an interesting bug (and solution) when installing VisualVM.

Symptom:  When starting VisualVM from the command line (i.e. > visualvm), there would be some delay as if the script/program were starting, but then I would just get the prompt back after a few seconds and the app would never appear.

Research:  I assumed the app was starting but failing.  After a small amount of web searching, I found the log file at ~/.visualvm/2.1.8/var/log/messages.log.  Though it gave the somewhat specific message below, the root cause seemed elusive.  Eventually, I downloaded the latest (2.1.9) VisualVM .zip file manually from the actual web site and followed the installation instructions, which are simply to unzip the file and then run 'visualvm.exe'.  This worked, but it should be noted that it DID start with an "Accept License" dialog that is noted in their troubleshooting guide AND implied by the error message.

WARNING [org.netbeans.core.startup.Main]
java.lang.NoClassDefFoundError: org.graalvm.visualvm.modules.startup.AcceptLicense starting fr
om org.netbeans.MainImpl$BootClassLoader@404b9385 with possible defining loaders null and decl
ared parents ]

Solution: It may be somewhat unconventional, and maybe not "guaranteed" in all situations, but I eventually just tried copying the 'visualvm.exe' from 2.1.9 into the /bin of version 2.1.8 (installed by SDKMAN).  This seemed to work, and I could now start it from Git Bash via > visualvm.exe

However... for simplicity's sake, I don't like having to remember to add the '.exe' at the Git Bash command line.  So, what I did was rename the original script ('visualvm') to something like 'visualvm_original'.  Then, I created a new script as shown below.  This worked.

new script: visualvm

#!/bin/sh
visualvm.exe &
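
For anyone who wants the exact commands, the whole workaround looks roughly like this. The paths are placeholders (SDKMAN's default candidate layout is assumed; adjust to wherever your 2.1.8 install and your unzipped 2.1.9 download actually live):

VV=~/.sdkman/candidates/visualvm/2.1.8/bin
cp ~/Downloads/visualvm_219/bin/visualvm.exe "$VV/"    # overlay the working 2.1.9 exe
mv "$VV/visualvm" "$VV/visualvm_original"              # keep the original launcher around
printf '#!/bin/sh\nvisualvm.exe &\n' > "$VV/visualvm"  # new wrapper so plain 'visualvm' works
chmod +x "$VV/visualvm"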


Tuesday, July 31, 2018

Maven Plugins and Eclipse

I have long meant to add this blog entry... Java newcomers are often confused by two different plugins that share VERY similar names: the maven-eclipse-plugin (a Maven plugin) and the Maven Eclipse plugin, a.k.a. m2eclipse (an Eclipse plugin). I can't even remember which is which based on their names alone.
Originally, the question was "which of these plugins should I use? or both?"  The answer has always been that the Maven Eclipse plugin (the Eclipse one) should be sufficient on its own if you are using Eclipse as your IDE with Maven projects.  It duplicated the only feature provided by "the other" plugin, thereby rendering it pretty pointless.  (But, to be fair, "the other" plugin came first and filled a need at the time.)

maven-eclipse-plugin

Note:  As of 7/31/2018 (maybe as far back as 5/28/2015), this plugin is RETIRED and no longer maintained.

As its website states:
The Maven Eclipse Plugin is used to generate Eclipse IDE files (*.classpath, *.project, *.wtpmodules and the .settings folder) for use with a project.
In other words, if you have a Maven-ized project that you are not yet using with Eclipse, you can use this Maven plugin to "pre-generate" the Eclipse IDE files so that you can open/use the project in Eclipse.  This is a "one-time" conversion; once the files are generated with the plugin, you should not need to do so again.
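
For context, the typical (now retired) usage was a one-time invocation from the project root, something like:

mvn eclipse:eclipse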

Maven Eclipse plugin (m2eclipse)

This is an actual plugin for Eclipse.  It serves two purposes:
  • Allows you to open Maven projects in Eclipse (and in the process, it does the same thing that maven-eclipse-plugin does - namely, a one-time creation of the Eclipse IDE project files)
  • Understands your local Maven installation and provides various features for configuring it. Also, and more importantly, this plugin allows you to run Maven (mvn) commands via your IDE's GUI.  Some of the most common commands are 'mvn clean' and 'mvn install'.
See this dated article for some history regarding m2eclipse.

Wednesday, April 16, 2014

Login Form

Thanks to this site I have discovered some new CSS and the great Font Awesome library.

This form looks good on a dark gradient background with white text.

#yourdiv {
    background: linear-gradient(#474747, #1D1D1D);
    border-radius: 5px;

    font-size: 13px;
    font-family: 'Lato', Calibri, Arial, sans-serif;
    text-shadow: 0 1px 0 rgba(255,255,255,0.8);
} 

It is common to have errors and warnings displayed.
Red errors and yellow warnings look good like this:

#insideyourdiv.error {
    color: #C00000;
    background: url("error.png"), url("error.png") scroll 0.5em 50% #FFF6DF;
    background-repeat: no-repeat, no-repeat;
    background-position: left top, right top;
    border: 1px solid #C00000;
}

#insideyourdiv.warning {
    color: #514721;
    background: url("warning.png"), url("warning.png") scroll 0.5em 50% #FFF6DF;
    background-repeat: no-repeat, no-repeat;
    background-position: left top, right top;
    border: 1px solid #FFB900;
} 
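
For reference, those selectors assume markup shaped roughly like this (the message text is made up; the ids and class match the selectors above):

<div id="yourdiv">
    <!-- login form fields go here -->
    <p id="insideyourdiv" class="error">Invalid username or password.</p>
</div>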


Friday, April 11, 2014

ExtJS Rant

First, let me say that I have very few negative comments regarding ExtJS/Sencha. I use it and consider it the premier* client-side javascript framework for web applications.

* I know there are many alternatives out there and am not trying to incite a holy war. This is simply my opinion: swayed by many things, not the least of which is the fact that Java is my "native" language. 

However, I finally came across something that just plain irked me. Simply put, there is an AJAX/JSON Reader property on the proxy object named 'messageProperty'. One would think that this could be used somewhere/somehow. As near as I can tell, it is virtually useless.

Here is some pretty boilerplate sample code from the common use case of programmatically synchronizing the Store in response to a Button click:

someStore = Ext.create('Ext.data.Store', {
    model: 'someModel'
    ,autoLoad: true // set to true for production
    ,autoSync: false // set to false since we are submitting the form via AJAX, in lieu of using store syncing
    ,sorters: [ { root: 'data' ,property: 'id' ,direction: 'DESC' } ] // ASC | DESC
    ,proxy: {
        type: 'ajax'
        ,listeners: { exception: aCommonExceptionListener }
        ,reader: { type: 'json', root: 'data', successProperty: 'success', messageProperty: 'message' }
        ,writer: { type: 'json', root: 'data', encode: true, allowSingle: false }
        ,api: { 
            read: 'aRestUrl/view' 
            ,update: 'aRestUrl/update' 
        }
    }
});

namespace.doSaveChanges = function() {

    var options = {
        callback: function() { }
        ,success: function() {
            a.LoadMask.hide();            
            Ext.Msg.show( { title: 'Success'
                ,msg: 'Changes were Saved'
                ,buttons: Ext.Msg.OK
                ,icon: Ext.Msg.INFO
            } );
            Ext.Msg.setY(25);                
        }
        ,failure: function(batch, options) { 
            a.LoadMask.hide();            
            Ext.Msg.show( { title: 'Error'
                ,msg: batch.proxy.reader.rawData.message
                ,buttons: Ext.Msg.OK
                ,icon: Ext.Msg.ERROR
            } );
            Ext.Msg.setY(25);
        }
    };
    someStore.sync(options);
}; 

So... notice the message property defined in the Store's Proxy.  Based on the similar property 'successProperty', I think it is safe to assume that the Proxy will make the JSON object with this name available to your code via a getter [like getMessage() !!!].  Nope... you will notice that I have to navigate the batch.proxy.reader.rawData.message property manually.  AND... there is no abstraction of 'message' at all at this point... You are forced to use the exact message property name that you send!
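
For reference, the reader config above (root: 'data', successProperty: 'success', messageProperty: 'message') maps onto a server response shaped roughly like this (the message text is made up):

{
    "success": false,
    "message": "Update failed: id 42 is locked by another user",
    "data": []
}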

If I am missing something, please educate me.  Please.  But until then... total fail on this one, fellas.

p.s. This is totally unrelated to this rant but I wanted to capture it.  I am using Spring WebMVC and, though this is documented by Sencha, it is not exactly obvious.  It is important that you configure your JSON writer with this...

root: 'data', encode: true

to ensure that Spring will put the JSON into the 'data' parameter on the request.

@RequestMapping(value="aRestUrl/update", method=RequestMethod.POST)
public void update(HttpServletRequest request, HttpServletResponse response) throws IOException {

    final UserRequestContext ctx = this.createRequestContext(request, null);
    // because of encode: true + root: 'data', the records arrive as a JSON string in the 'data' form parameter
    final String data = request.getParameter("data");        
    
    final ServiceResponse sr = this.getService().updateSpecialQuestions(ctx, data);
    final ExtJSAJAX model = new ExtJSAJAX(sr);       
    
    // writes the model back out as the JSON envelope the ExtJS reader expects
    this.getJSON(response, model);
}    
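
For what it's worth, with encode: true and allowSingle: false the writer submits even a single modified record wrapped in an array, so request.getParameter("data") comes back as a JSON string along these lines (field names made up):

[{"id":42,"someField":"new value"}]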

Wednesday, April 9, 2014

ExtJS Button - Confirm when using HREF

ExtJS4 buttons (Ext.button.Button) have the properties 'href' and 'hrefTarget' to allow a button to act more naturally like a hyperlink.  However, I had the need to display a confirmation dialog (under certain conditions) before navigating away from the page.  This code feels a bit hacky but it works. The most pertinent thing to mention is that 'allowDefault' is not an ExtJS property, so using it _might_ be subject to a conflict in later versions of ExtJS.  Constructive feedback and questions are welcome.

This is the relevant stack:
ExtJS 4.2

Simply install this function as the click listener (not to be confused with the 'handler' property, which may work but is untested by me):
var button = Ext.create('Ext.button.Button', {   
    text: 'Back to ...'
    ,iconCls: 'clsActionBack'
    ,disabled: false
    ,href: 'href-goes-here'
    ,hrefTarget: '_self'
    ,listeners: { click: the_following_function } // note: the_following_function (below) must be defined before this Ext.create runs
    ,handler: function() { }
    ,scope: this
});


var the_following_function = function(btn, e, eOpts) {        
    if (btn.allowDefault) {
        // nothing special to do; will fall thru to return true
    } else {
        var isdirty = some_boolean_logic_or_function_here();
        if (isdirty) {
            e.preventDefault();

            Ext.Msg.confirm( 'Confirm'
                ,'Do you want to leave this page without saving your changes?'
                ,function(id, value) { 
                    if (id === 'yes') {                        
                        btn.allowDefault = true;
                        btn.btnEl.dom.click();
                    }
                }
            );
            Ext.Msg.setY(25); // (optional) reposition the msgbox

            return false;                 
        }
    }
    
    return true;
};

Thursday, June 27, 2013

Dual Boot (Windows Vista + Linux Mint 15)

So... I know Windows Vista is a less than desirable Windows OS but... it's the latest one I have paid for and... my old dilapidated PC had its motherboard go bad. So, long story short, I built a new computer without any pre-installed OS and wanted to do a dual boot system, since I have recently been reintroduced to the Linux world and absolutely love the Linux Mint distro.

Here's my "stack":

  • Hardware: 
    • ASRock Z87 Extreme4 motherboard 
    • Intel i5 Processor
    • 120GB SSD (Samsung) 
    • 8GB RAM (no video card initially)* 
  • OS: 
    • Windows Vista - 32bit 
    • Linux Mint 15 (Olivia) - 32bit** 
  • SSD Partitions:*** 
    • (1) 60GB NTFS 
    • (2) 30GB extended 
    • (2a) 8GB swap 
    • (2b) 10GB ext3 (mounted as /) 
    • (2c) 12GB ext4 (mounted as /opt) 
    • (2d) 4GB fat32 (just because) 
    • (3) 15GB NTFS (used to "share" between linux and windows) 
    • (4) 15GB xfs (mounted as /home)
* I will update this post when I install the video cards
** I did try to install the 64bit edition but ran into problems that I didn't have the time nor knowledge to solve.
*** Probably the most important info is that it is all on the same drive and I used extended/logical partitions. The sizes of the partitions and the filesystems chosen were experimental.

Step 1 - Install Windows Vista into partition 1 

This went flawlessly using the install DVD. Other than having to sit through the three to four years of updates and reboots, it went fine. (FWIW, this used about half of my 60GB partition! I didn't expect it to be quite that large, which just reaffirms my growing grumpiness with MS.)

Step 2 - Install Linux Mint 15 from LiveCD 

This also went flawlessly.

Step 3 - oops! Intermittent problems booting Vista from Grub 

Vista would always boot but 50% of the time it had one or more of the following bad symptoms: (1) Video was distorted (2) USB Keyboard would not work (3) USB Mouse would not work

Step 4 - the fix 

I won't go into all the google research and options I tried but here is what finally seems to work very well!
  1. Repair Vista MBR
    Used Vista install DVD to repair the MBR (see http://arstechnica.com/civis/viewtopic.php?f=16&t=158228); do not reinstall/enable grub at this point 
  2. Reboot
    Now, Vista boots great (but grub is gone so you can't yet boot into your previously working Mint installation) 
  3. Use EasyBCD
    Under Vista, use EasyBCD (just google it and install) to "Add a new entry". Choose: 
    1. Linux (and whatever your Mint root partition is) 
    2. Grub2 
  4. Reboot
    Your system now uses Windows Boot Manager natively then lets you use grub. This is backward from most install instructions you'll find for dual/multi boot systems but I don't care as long as it works :) 
Feel free to post comments though I am personally most interested if you have solutions that make GRUB work or know of why I might be having problems with the 64bit edition of Mint.

Wednesday, May 29, 2013

SSL + PostgreSQL + Apache BasicDataSource ( + Spring )

So... this may not be completely advisable for a production environment, but if you are using this stack, here is a snippet to show you how to get SSL connections working with PostgreSQL. It can surely be extrapolated for use in a production environment. The "connectionProperties" property is the important piece, though the other common properties are included for reference:
Stack:
  • PostgreSQL 
    • postgresql-9.0-801.jdbc4.jar 
  • Apache Commons 
    • commons-dbcp-1.4.jar 
  • Spring 
    • spring-xxx-3.0.5.RELEASE.jar 

<b:bean class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close" id="yourid">
    <b:property name="url" value="${yourprop.url}"/>
    <b:property name="username" value="${yourprop.username}"/>
    <b:property name="password" value="${yourprop.password}"/>
    <b:property name="driverClassName" value="${yourprop.driverClassName}"/>
    <b:property name="connectionProperties" value="ssl=true;sslfactory=org.postgresql.ssl.NonValidatingFactory;"/>
</b:bean>

Tuesday, April 23, 2013

Spring (3.0.5) + Transactions + postgreSQL/GreenPlum

The long and the short of it is that I am getting an exception when using declarative transactions with Spring + postgreSQL (GreenPlum variant): org.postgresql.util.PSQLException: Cannot change transaction read-only property in the middle of a transaction.

My stack is:
Java 1.7.0_17
Tomcat 7.0
Spring 3.0.5*
Apache DBCP 1.4
postgresql-9.0-801.jdbc4.jar
GreenPlum 4.2.2.4 (Commercial postgreSQL variant)
* I get this issue whether using declarative OR programmatic tx mgmt

My error is:
DEBUG o.s.j.d.DataSourceTransactionManager - Initiating transaction rollback
DEBUG o.s.j.d.DataSourceTransactionManager - Rolling back JDBC transaction on Connection [jdbc:postgresql://10.110.62.245/data, UserName=app_reports, PostgreSQL Native Driver]
DEBUG org.mybatis.spring.SqlSessionUtils - Transaction synchronization ended with unknown status for SqlSession [org.apache.ibatis.session.defaults.DefaultSqlSession@2db1692e]
DEBUG org.mybatis.spring.SqlSessionUtils - Transaction synchronization closing SqlSession [org.apache.ibatis.session.defaults.DefaultSqlSession@2db1692e]
DEBUG o.s.jdbc.datasource.DataSourceUtils - Resetting read-only flag of JDBC Connection [jdbc:postgresql://10.110.62.245/data, UserName=app_reports, PostgreSQL Native Driver]
DEBUG o.s.jdbc.datasource.DataSourceUtils - Could not reset JDBC Connection after transaction
org.postgresql.util.PSQLException: Cannot change transaction read-only property in the middle of a transaction.
 at org.postgresql.jdbc2.AbstractJdbc2Connection.setReadOnly(AbstractJdbc2Connection.java:617) ~[postgresql-9.0-801.jdbc4.jar:na]
 at org.apache.commons.dbcp.DelegatingConnection.setReadOnly(DelegatingConnection.java:377) ~[commons-dbcp-1.4.jar:1.4]
 at org.apache.commons.dbcp.PoolingDataSource$PoolGuardConnectionWrapper.setReadOnly(PoolingDataSource.java:338) ~[commons-dbcp-1.4.jar:1.4]
I will update this post with the pertinent configs/code, but simply put, I am using a VERY basic code example and ALSO there are numerous hits on Google with the exact same issue. I highly suspect it is isolated to GreenPlum, but it may also be isolated to GreenPlum+DBCP. The problem is, none of the web posts have a solution... if you do, please comment on my post! :) Thanks

Saturday, January 9, 2010

Multiple MySQL on Vista

Even though I don't have it fully working yet, here are the steps that I have taken:
  • Copy the installed data directory (with a new name obviously)
  • Copy the installed my.ini to new.ini and edit accordingly (see notes below)
  • Copy the installed innodb directory (with a new name obviously)
  • Install a new service to start the new instance (this was trickier than I thought but see notes below)
Edits to my.ini
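The exact values depend on your layout; a minimal sketch follows, with a placeholder port and paths rather than the values I actually used. At minimum the copied ini needs its own port and must point at the copied directories.

[mysqld]
# use a different port so the new instance doesn't collide with the original (default 3306)
port=3307
# point at the copied data and InnoDB directories
datadir="C:/yourpath/mysql/data_new"
innodb_data_home_dir="C:/yourpath/mysql/innodb_new"
innodb_log_group_home_dir="C:/yourpath/mysql/innodb_new"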

Install a New Service via Command Line
There were a few things that I came across:
  • When running the console, you must 'Run as Administrator' which is accessible if you right-click the shortcut for the console.
  • The syntax for adding the mysql executable with the correct options was tricky.  Basically, you must escape the double-quotes with a backslash as in the following example.  
  • Note that apparently the caret ^ is an escape character for most other command line characters.
  • Note also that the space after binPath= is apparently required.
sc create "MySQL whatever" binPath= "\"c:\yourpath\mysql\bin\mysqld-nt\" --defaults-file=\"c:\yourpath\mysql\new.ini\" MySQL"

Wednesday, August 19, 2009

LightUML

Recently, I installed the LightUML plug-in for Eclipse. The installation "process" was easy and straightforward but when I tried to create class diagrams I encountered error dialogs from Eclipse. So, after a little web research, it basically required a couple of "configuration" particulars. Some of my notes are below, but basically I decided to tell LightUML to use the UMLGraph.jar from version 4.8 (instead of the 5.2 that I had installed). As far as I know (when I wrote this), doing so did not "mess up" my UMLGraph 5.2 installation; the configuration change(s) are isolated to the LightUML plug-in.

Here are my notes:
My LightUML Install Notes : This worked for me
==============================================

=====Environment=============
Windows
Eclipse v3.4.1
GraphViz v
UMLGraph v4.8/v5.2 (see notes below)

=====Eclipse Plugin Settings==============
Preferences > Java > LightUML
* Graph file name: graph
* Output directory (...optional): src/main/lightuml
* Use package or project name as the graph file name: true
* Recurse into subpackages: true
* Javadoc executable path (optional): C:\Program Files\Java\jdk1.5.0_16\bin\javadoc.exe

Preferences > Java > LightUML > Class Diagrams > General
* attributes
* constructors
* operations
* UMLGraph extra command line parameters (optional): -outputencoding UTF-8

Preferences > Java > LightUML > Dot and Pic2plot
* Extra lookup path (optional): C:\develop\tools\graphviz\v2.24\bin
* Graphics format: png


Preferences > Java > LightUML > UML Graph
* UmlGraph.jar path: C:\develop\tools\umlgraph\UMLGraph-4.8\lib\UmlGraph.jar
* sequence.pic path: (empty) [but probably needs to be something if I try to use sequence diagrams]
* UMLGraph version 4.4+

=====Eclipse Plugin modifications============
(path...)/workspace/.metadata/.plugins/org.lightuml.core/

Change 1 : < file: build.xml >
Description: For some reason, the ant script thinks that graphviz is failing even though it is just spitting out warnings about fonts. I commented out the <fail> tag in the Ant script and it allowed it to proceed. That is probably a workaround for telling Ant how to properly recognize/ignore the warnings.
<target name="dot-to-graphics">
<!-- load setting for this run (the dot-file-name) -->
<property file="runsettings.ini" />

<property environment="env"/>
<exec
executable="dot"
searchpath="true"
errorproperty="dot.error">

<env key="PATH" path="${extra-lookup-path}:${env.PATH}"/>
<arg line="${dot-extra-param}"/>
<arg value="-T${graphics-format}"/>
<arg value="-ograph/${dot-file-name}.${graphics-format}"/>
<arg value="graph/${dot-file-name}.dot"/>
</exec>
<!-- remove the dot file -->
<delete file="graph/${dot-file-name}.dot" />

<!-- =================================
<fail message="Error executing Graphviz 'dot' ::: ${dot.error}">
<condition>
<length string="${dot.error}" when="greater" length="0"/>
</condition>
</fail>
================================= -->
</target>

Change 2 :
Description: As found on a SourceForge bug report, make all occurrences of 'useexternalfile' look like the following snippet. *At this time, I honestly do not know what this does.
useexternalfile="no"

Monday, July 13, 2009

Where are my iBatis log messages?

I had set up my webapps and Tomcat to use log4j according to their very own documentation [insert-link-here], but my webapps were not logging iBatis log messages nor the java.sql messages that it uses.

Short story first:
Because of what must obviously be classloader issues (as discussed ad nauseam on the web), I moved the following files from ...tomcat/common/lib to .../tomcat/server/lib:
  • commons-logging-1.1.1.jar
  • log4j-1.2.8.jar
Now, everything seems to work fine, yet I only found ONE (1) web site which gave this advice (and it was not directly related to missing log entries). Hope it helps someone else, since I spent quite a bit of time perusing the web.

Any feedback is welcome (of a constructive sort), but here are the key points I will take away from this lesson:
  • I think I would recommend this for any web application setup that uses JCL (either directly or through a dependent JAR).
  • Tomcat logging is working correctly via a log4j.properties file in .../tomcat/server/classes, and all webapps seem to be logging correctly as configured in their WEB-INF/classes/log4j.xml file (the iBatis/java.sql categories are sketched below)
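
For completeness, the categories I mean are along these lines (shown in log4j.properties syntax; the same category names work in a log4j.xml). These are the names the iBatis docs point at, set to DEBUG so the SQL and parameters show up:

log4j.logger.com.ibatis=DEBUG
log4j.logger.java.sql.Connection=DEBUG
log4j.logger.java.sql.Statement=DEBUG
log4j.logger.java.sql.PreparedStatement=DEBUG
log4j.logger.java.sql.ResultSet=DEBUG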
