Tuesday, March 19, 2013

Hiding the secondary menu on a single page in Drupal

If you want to hide the secondary menu on a single page in Drupal, this doesn't seem to be possible through the administration web interface. You can disable all secondary menus on your website through the appearance settings of your theme (Appearance -> Settings -> name of your theme), but this is not what we are trying to achieve here.

You can, however, disable the secondary menu on a single page by adding a few lines of code to your CSS files. Before you start editing your CSS files, you need to uniquely identify the page on which you want to hide the secondary menu. By default, Drupal generates a unique identifier for each page as a class on the body element, in the format "page-<NAME>". You can look up this value by using the Developer Tools of your browser and checking the class attribute of the body HTML element.
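
For example, the body element of a node page might look like this (the exact class names are illustrative and depend on your theme and page type):

```html
<body class="html not-front not-logged-in page-node page-node-42 node-type-article">
```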



Then add the following lines of code to your CSS file:

.page-<NAME> #submenu {
  display: none;
}

The secondary menu section on a Drupal page is identified by the "submenu" id. The display: none property hides an element so that it no longer takes up any space. This code snippet in your CSS file therefore prevents the submenu from being displayed on the page identified by the string value after the "." class selector.
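
Note that the id of the secondary menu container depends on your theme. If your theme does not use the "submenu" id, inspect the markup with your browser's Developer Tools and target whatever id your theme assigns. A hypothetical sketch for a theme that uses a "secondary-menu" id instead:

```css
/* Hide the secondary menu only on the page whose body carries the "page-<NAME>" class */
.page-<NAME> #secondary-menu {
  display: none;
}
```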

Author: Kristof Lievens

Tuesday, March 12, 2013

Central Administration of TIBCO Enterprise Message Service (EMS)

In the summer of 2012, TIBCO launched a new version of its messaging bus, Enterprise Message Service (EMS). TIBCO Enterprise Message Service lets applications consume or publish messages according to the Java Message Service (JMS) API.

One of the new features is the ability to perform central administration through a standard web browser. In the past you mainly had three options to ‘control’ your EMS server:
  • The command line utility tibemsadmin
  • The EMS plugin of Tibco Administrator
  • A tool, like Gems or Hermes, created by a third-party

The Central Administration feature, installed automatically with EMS 7.0, offers you:
  • A web-based graphical user interface for configuring TIBCO EMS servers
  • Centralized configuration, allowing administrators to apply configuration changes across multiple TIBCO Enterprise Message Service servers from a single location
  • Support on Windows, Linux, and Mac platforms

How to get the Central Administration running on your machine

After installing version 7 of TIBCO Enterprise Message Service, you need to convert the ’old style’ tibemsd.conf to a JSON (JavaScript Object Notation) file. The text-based tibemsd.conf file is not compliant with the Central Configuration feature, and EMS servers started with a tibemsd.conf file cannot be managed by the Central Administration server.

Command: tibemsconf2json.bat -conf source-file.conf -json output-file.json


Note: when EMS is configured as a Windows service, you need to tweak the registry key to change the startup parameter from tibemsd.conf to tibemsd.json (HKEY_LOCAL_MACHINE\SYSTEM\ControlSet00X\services\tibemsd\Parameters)
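
A sketch of that registry tweak (the exact value name under the Parameters key varies per installation, so inspect it first; CurrentControlSet is an alias for the active ControlSet00X):

```
reg query "HKLM\SYSTEM\CurrentControlSet\services\tibemsd\Parameters"
rem Edit the value that points to tibemsd.conf so that it points to
rem tibemsd.json instead, e.g. via regedit or "reg add" with the value
rem name found above.
```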

Once converted, you can start your EMS server, this time using the JSON configuration file.
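
For example (paths and file names are illustrative):

```
tibemsconf2json.bat -conf C:\tibco\ems\7.0\bin\tibemsd.conf -json C:\tibco\ems\7.0\bin\tibemsd.json
tibemsd.exe -config C:\tibco\ems\7.0\bin\tibemsd.json
```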

Creating a configuration file

Although not mandatory, you can configure the server using a properties file that holds the Central Administration server options. Example:
com.tibco.emsca.data.dir=c:/tibcoems7/tibco/cfgmgmt/emsca_data
com.tibco.emsca.http.hostport=*:8080

I’ve saved mine as emsca.properties in the EMS_HOME/bin location.

Start the central administration server with the command tibemsca.bat (or tibemsca.sh). By default the server will look for a file called emsca.properties in the current working directory.


By default, the Central Administration server does not configure an SSL connection, nor does it require users to pass login credentials. The Central Administration server uses the same username and password to log into the EMS server as were used to log in to the Central Administration web interface. But, as said, by default there is no login, so it uses user ‘admin’ with no password. You’ll have to configure JAAS authentication to make it work with a password. (I’ll leave some room for a next blog post.)

Default screen after installation

Type a name without spaces (e.g. local_ems_instance), click Create, and pass the URL of your EMS server (e.g. tcp://localhost:7222), and you are good to go!


Configuring your EMS server using the web portal.

Author: Günther

Monday, February 18, 2013

Looking up Tasks faster and more efficiently in webMethods

If you want to locate tasks (that use standard business data) on the Task Engine to which the webMethods Integration Server is connected, you invoke the built-in service "pub.task.taskclient:searchTasks". When working with indexed business data fields, the service "pub.task.taskclient:searchTasksIndexed" is called instead.

In a past experience, an invocation of the service "pub.task.taskclient:searchTasks" resulted in a long-running SQL query on the related database source. This eventually led to a "java.net.SocketTimeoutException: Read timed out" between the Integration Server and the Task Engine: a result set of 140 active tasks took more than 90 seconds to be returned, while the Integration Server by default disconnects the ongoing Task Engine query after 60 seconds.

One option is to set the Integration Server extended setting "WSClientConfig.SocketTimeout" and increase the timeout value (in milliseconds).
A better, more elegant and formal solution is to replace the invoke of "pub.task.taskclient:searchTasks" with "pub.task.taskclient:searchTasksIndexed" and set the boolean businessData to false. The replacement service takes advantage of the built-in index capabilities, whereas "pub.task.taskclient:searchTasks" does not.
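
If you do go for the timeout route, the workaround boils down to a single key/value pair under Settings > Extended in the Integration Server Administrator (the value of 120000 ms below is only an example):

```
WSClientConfig.SocketTimeout=120000
```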

With "pub.task.taskclient:searchTasksIndexed" instead of "pub.task.taskclient:searchTasks", a similar result set was returned in a couple of seconds instead of causing an operational problem for the integration.

Author: Johan

Monday, February 4, 2013

Turn Machine Data into Real-time Visibility, Insight and Intelligence

Ever needed to analyze your system? To look at what's going on? Ever faced the insanity of huge logs? Then I may have a working solution for you... and yes, it is partly free, and yes, it is cloud based. The magical product is SplunkStorm from splunk.com.

What is SplunkStorm?

From the documentation - http://www.splunk.com/product :

Splunk Enterprise is the platform for machine data. It's the easy, fast and resilient way to collect, analyze and secure the massive streams of machine data generated by all your IT systems and technology infrastructure.
Troubleshoot problems and investigate security incidents in minutes (not hours or days). Monitor your end-to-end infrastructure to avoid service degradation or outages. Gain real-time visibility and critical insights into customer experience, transactions and behavior. Make your data accessible, usable and valuable to everyone.



How to get started?

  • First create an account
  • Next add your first project
  • Choose the plan you wish to use; in this case I can live with the Free plan (1 GB storage).

SplunkStorm Main Dashboard

What's next? 

Let's import our first log file. For the purpose of this post I only use a file-based log. Forwarders seem a great approach, but that goes too far for an introduction.

Press the File menu item and upload a log file. I use the log file from an Oracle Service Bus installation running on top of WebLogic.
 

After pressing the upload button, the Splunk magic starts: Splunk parses the log file, extracting events based on their timestamps.

Viewing the data


Go to the Project home, then press explore data.

As soon as SplunkStorm has finished indexing your log files, you can drill down into issues, follow what's going on, ...


Not Cloud minded?

Splunk also has a local installer, available for the different platforms: Linux, Mac, Windows, ... Should I find more time, I'll drill further into the reporting capabilities of this tool in future posts.

Sources

Splunk.com
Splunkstorm.com
Doc: https://www.splunkstorm.com/storm/support
Tutorial : http://docs.splunk.com/Documentation/Storm/latest/User/WelcometotheStormtutorial

Author: A.Reper

Tuesday, January 8, 2013

DataPower virtual environment

At i8c, several consultants are working with DataPower, so we were pleased with the announcement of a virtual DataPower. The released product is the WebSphere DataPower Service Gateway XG45 Virtual Edition for Non-Production Environment, Version 5.0.0.

The supported VMware hypervisors are:
  • VMware ESX, version 4.0 or version 4.1
  • VMware ESXi, version 4.0 or version 4.1
  • VMware vSphere editions: Standard, Enterprise, and Enterprise Plus, version 5.0 or version 5.1
  • VMware vSphere Hypervisor, version 5.0 or version 5.1
Although it is not meant for production, it is a great opportunity for us to educate our consultants, create some scripts for it, develop on it, create demos, ...

The deployment is very simple.

First we downloaded the OVA file (xg5000.tam61_nonpd_vmware.ova), which also came with a release kit. The OVA file is less than 700 MB.

This OVA file was then deployed to our ESX server using the wizard, which is very straightforward.


After the wizard completes, the ESX client starts deploying the virtual appliance.


One coffee later the appliance is deployed. 


We could then start it, go to the console and start our first session. The default user and password are admin/admin. The initial steps are the same as on a regular DataPower; see Setting up the initial firmware configuration in the DataPower SOA Appliances V5.0.0 Information Center.


Because we used DHCP for our DataPower, we needed to know its IP address. At the console, you can find the IP address by issuing the command show ethernet.


Afterwards we were able to go to the web GUI.
 


Now the fun can begin.

Author: Jef Jansen

Friday, January 4, 2013

SAP Netweaver Cloud Connectivity Service

SAP Netweaver Cloud is the new Platform-as-a-Service offering of SAP. It fully supports the Java EE 6 Web Profile, which means no limitations on which libraries you can use. This shows again that SAP likes ABAP most as a programming language, but Java is also a very good friend.

SAP Netweaver Cloud is based on OSGi, for which SAP has chosen the Virgo container. A good introduction is available in an article on InfoQ.
 
 
Note: pricing information on SAP's PaaS offering is not (yet) publicly available.
 
One of the challenges for cloud applications is how to integrate with on-premise applications. SAP Netweaver Cloud comes with the SAP Netweaver Cloud Connectivity service to allow the applications on the SAP Netweaver Cloud to communicate with on-premise applications. The SAP Cloud Connector is installed locally and makes SSL/TLS connections to the SAP Netweaver Cloud from behind the corporate firewall. The bi-directional TCP/IP connections are used to invoke services of on-premise applications over HTTP (using http://hc.apache.org API).

The SAP Cloud Connector needs to be installed on a SUSE Linux server (SUSE Linux 11 SP1 or SP2). The setup is somewhat similar to the Secure Data Connector of Google. Google also requires the use of Linux, but leaves freedom on which Linux distro to use.

Congrats to SAP for focusing on (Java) standards for their cloud offering. But it is a pity that each cloud solution comes with its own cloud connector. It would be so much nicer if a single cloud connector/adapter could be used for many cloud platforms!
 
For 2013 we (i8c) will do more testing of PAAS Connectivity options.
 
Author: Guy

Saturday, December 29, 2012

Best wishes!

For application integration, things look very promising according to Gartner: "organizations will spend 33% more on application integration in 2016 than they will in 2013".  That's the main message of Gartner's report "Predicts 2013: Application Integration".
 
Stephanie Mann of Techtarget reports on that same message being communicated by Gartner at their AADI conference, some interesting snippets from her article:
  • "What you want me to say is that cloud APIs [application programming interfaces] are solving your problems, and that REST is the answer," Lheureux told his audience. "But we're not just automating the process and sending messages. We're looking at actually collaborating more at the process-execution level."
  • "More than 50% of the cost of implementing new systems will be spent on integration in the next five years," Schulman said. "Our architectures are obsolete; the way we approach integration is obsolete; and the way we think about integration development is increasingly obsolete."
Best wishes for 2013!