
    Executing a Stored Procedure from Oracle Business Intelligence Cloud Service (BICS)


    Introduction

     

This article describes how to configure a link on an Oracle Business Intelligence Cloud Service (BICS) Dashboard that allows a BICS consumer to execute a stored procedure. In this particular example the stored procedure inserts a record into a table located in the Oracle Schema Service. However, the provided steps can also be applied to a BICS DbaaS environment.

The final Dashboard created in this article is displayed below. The BICS consumer clicks the refresh link, which calls a database function (using EVALUATE), which in turn executes the stored procedure. The stored procedure contains the logic to update / refresh the given table. The example provided inserts a single row into a one-column table. However, this solution can be easily modified for much more complex use cases.

[Screenshot: the final Dashboard]

     

    The article is divided into seven steps:

     

    Step One: Create Table (to store data)

    Step Two: Create Stored Procedure (to load data)

    Step Three: Create Function (to execute Stored Procedure)

    Step Four: Create Dummy Table (to reference the EVALUATE function)

    Step Five: Create Expression in Data Modeler (that references EVALUATE function)

    Step Six: Create Analysis (that executes EVALUATE function)

    Step Seven: Create Dashboard (to display Analysis)

    Main Article

     

    Step One – Create Table (to store data)

    For a text version of SQL Scripts in Steps One, Two, and Three click here

    Note: All SQL Statements have been run through Apex -> SQL Workshop -> SQL Commands


    1)    Create Table

    CREATE TABLE STORE_DATA
    (DATA_FIELD TIMESTAMP);

    Step Two – Create Stored Procedure (to load data)

    1)    Create Stored Procedure

    CREATE OR REPLACE PROCEDURE LOAD_DATA AS
    BEGIN
    INSERT INTO STORE_DATA (DATA_FIELD)
    VALUES(SYSDATE);
    END;

    2)    Test that executing the stored procedure inserts into the table without errors.

    BEGIN
    LOAD_DATA();
    END;

    3)    Confirm that data loaded as expected.

    SELECT * FROM STORE_DATA;


    Step Three – Create Function (to execute Stored Procedure)

1)    The function's main purpose is to execute the stored procedure. At the time of writing it was found that an input parameter, PRAGMA AUTONOMOUS_TRANSACTION, and a return value were required.

CREATE OR REPLACE FUNCTION FUNC_LOAD_DATA (
p_input_value VARCHAR2 -- input parameter required so the function can be referenced via EVALUATE
) RETURN INTEGER
IS PRAGMA AUTONOMOUS_TRANSACTION; -- permits the commit even though the function is invoked from a query
BEGIN
LOAD_DATA();
COMMIT;
RETURN 1; -- a return value is required
END;

2)    Confirm that the function can be run successfully.

SELECT FUNC_LOAD_DATA('Hello')
    FROM DUAL;


    3)    Confirm that each time the function is referenced the table is updated.

    SELECT * FROM STORE_DATA;


     

    Step Four – Create Dummy Table (to reference EVALUATE function)

    For a text version of SQL Scripts in Step Four click here

    1)    Create Table

    CREATE TABLE DUMMY_REFRESH
    (REFRESH_TEXT VARCHAR2(255));

    2)    Insert descriptive text into table

    INSERT INTO DUMMY_REFRESH (REFRESH_TEXT)
VALUES ('Hit Refresh to Update Data');

    3)    Confirm insert was successful

    SELECT * FROM DUMMY_REFRESH;


     

     

     

Step Five – Create Expression in Data Modeler (that references EVALUATE function)

    1)    Lock to Edit the Data Model

    2)    Add the “DUMMY_REFRESH” table as a dimension table.


3)    Join the DUMMY_REFRESH table to a Fact Table. This does not have to be a true join. However, the data types must match.


    4)    Click on DUMMY_REFRESH in the Dimension Table list

    5)    Click on Add Column

6)    In the Expression Builder type:

For a text version of the expression click here

EVALUATE('FUNC_LOAD_DATA(%1)','Hello')

Note: the %1 placeholder is replaced by the second argument ('Hello') when the BI Server generates the physical SQL; additional arguments can be passed as %2, %3, and so on.

    7)    In Name and Description type: Run_Func

    8)    Validate the Expression


    9)    Click Done


    10)   Click Done

    11)   Publish the Model

Step Six – Create Analysis (that executes EVALUATE function)

    1)    Create a new Analysis

    2)    It may be necessary to Refresh -> Reload Server Metadata in order to see the new expression created in the previous step


    3)    Add both columns


    4)    From the Results tab set the Run_Func column to be hidden


    5)    Remove the Title

6)    Go to the Column Properties of DUMMY_REFRESH and click “Custom Headings”. Type a space for the heading name.


7)    Save the Analysis. It should look something like below.

[Screenshot: the saved Analysis]

     

Step Seven – Create Dashboard (to display Analysis)

    1)    Add the Analysis to a Dashboard

    2)    Set custom Report Links


     

     

    3)    Customize -> Refresh


    4)    The Analysis should look something like the below on the Dashboard

[Screenshot: the Analysis on the Dashboard]

     

5)    To run the function that updates the data, hit Refresh.

    6)    Confirm data was updated in the STORE_DATA table.

    SELECT * FROM STORE_DATA;

7)    For certain use cases it may also be beneficial to display the table that is being updated (in this example, the STORE_DATA results) in a separate Analysis on the same dashboard. This allows the BICS consumer to view the new results immediately after they are refreshed.

    Further Reading

    Click here for related A-Team BICS blogs

Click here for more information on the EVALUATE function. This link is to the “Logical SQL Reference” section of the “Oracle® Fusion Middleware Metadata Repository Builder's Guide for Oracle Business Intelligence Enterprise Edition”; therefore, not all commands in that guide are applicable to BICS. The relevant section on the EVALUATE function is provided below. Note that for BICS environments EVALUATE_SUPPORT_LEVEL is enabled by default.

[Screenshot: EVALUATE section of the Logical SQL Reference]

    Summary

     

This article described how to configure a link on an Oracle Business Intelligence Cloud Service (BICS) Dashboard that allows a BICS consumer to execute a stored procedure from the Dashboard.

The key components of the solution are the use of EVALUATE in the Data Modeler and referencing PRAGMA AUTONOMOUS_TRANSACTION in the database function.

    In this example the stored procedure is executed by clicking the refresh link. An alternative approach would be to invoke the stored procedure through a Dashboard Prompt.

The example shown can be easily modified to (see the sketch after this list):

    1) Pass values from the Dashboard to the Stored Procedure.

    2) Return output values such as “number of records inserted” or “number of records failed”.
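A minimal sketch of these modifications is shown below. It mirrors FUNC_LOAD_DATA from Step Three; the procedure LOAD_DATA_PARAM and the returned message are hypothetical, standing in for whatever parameterized load logic and status reporting the use case requires.

CREATE OR REPLACE FUNCTION FUNC_LOAD_DATA_PARAM (
p_input_value VARCHAR2 -- value supplied from the Dashboard via EVALUATE
) RETURN VARCHAR2
IS PRAGMA AUTONOMOUS_TRANSACTION;
BEGIN
LOAD_DATA_PARAM(p_input_value); -- hypothetical procedure that accepts a parameter
COMMIT;
RETURN 'Loaded: ' || p_input_value; -- hypothetical status message shown in the Analysis
END;

The Data Modeler expression would then pass a Dashboard-driven value, for example EVALUATE('FUNC_LOAD_DATA_PARAM(%1)', "DUMMY_REFRESH"."REFRESH_TEXT").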

    When using this solution to load / refresh data in BICS, it is important to also remember to add logic to clear the cache.
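For example, assuming a cache-clearing procedure such as the CLEAR_CACHE procedure built in the BICS REST API article below, a sketch of a refresh function that both loads the data and clears the cache could look like this:

CREATE OR REPLACE FUNCTION FUNC_LOAD_AND_CLEAR (
p_input_value VARCHAR2
) RETURN VARCHAR2
IS PRAGMA AUTONOMOUS_TRANSACTION;
p_status VARCHAR2(500);
BEGIN
LOAD_DATA(); -- refresh the table
COMMIT;
CLEAR_CACHE(p_status); -- assumed procedure that purges the BI Server cache
RETURN p_status;
END;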

Copy Data from BICS Schema Service to BICS DBaaS or an On-Premise DB


    Introduction

This article will cover the steps to migrate the data from a BICS Schema Service Database to a BICS DBaaS Database.  Similar steps could be used to move a database from BICS to on-premise, or from on-premise to a BICS DBaaS instance.

    The process involves creating a DMP file in BICS that is then downloaded and moved to the BICS DBaaS instance, and then imported with the DATAPUMP tool.

    This has the advantage of not only copying all tables and their data, but also any views, indexes, packages, procedures, functions, triggers, and sequences that had been created in the BICS environment.

     

    Main Article

    Export Data from BICS Schema Service Database

    1. Open the Database Service page by clicking the cloud icon


    2. Select the Export Data option in the Administration Menu to open the Exports section.  Within that click ‘Export Data’ and make a note of the Service SFTP Host & Port, and the User Name.  These settings will be required to download the DMP file once it has been created.


    3. Check the option to ‘include data’ and then click ‘Create Data Export’

     


    This will then show a data export has been requested and will show in ‘Requested’ status.


    This status will change to ‘Available’ once the DMP file has been created.  The time for this to be available will depend on the size of the database being exported.

    The name of the DMP file, and its size will also be shown.


4. With an SFTP tool such as FileZilla, connect to the SFTP server using the details noted in step 2.  Select the ‘download’ folder on the BICS Server, locate the DMP file created in the previous step, and download it locally.
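As an alternative to a GUI tool, the OpenSSH command-line client could be used; the host, port, and user are the values noted in step 2, and the file name is the one shown in step 3:

sftp -P PORT USERNAME@SFTPHOST
sftp> cd download
sftp> get YOURDMP.dmp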

     


     

    Import Data into DBaaS

    The DMP file needs to be copied to the DBaaS server.  This requires the SSH Public Key that was created when the DBaaS instance was set up.  For more information on how to copy the file, see this document and the section Copy the Dump File to the Cloud Database Instance.

    Before running the Data Pump import command, the schemas and tablespaces from the original BICS Schema Service DB, and the target DBaaS DB need to be ascertained.

    Logged into the DBaaS schema where the data will be loaded, run the command below to get the tablespace name.

    select default_tablespace from user_users;

    In the BICS Schema Service SQL Workshop, the same tablespace command can be run, while the schema name is listed in a box at the top as shown below.

     


    Connected to the DBaaS database, set the Oracle SID and HOME, and then as the system user, run this command – substituting the relevant schema and tablespace names.  This will import all the objects from the BICS schema service database into the new DBaaS schema.

    impdp system/SYSTEMPASSWORD SCHEMAS=BICSSCHEMANAME remap_schema=BICSSCHEMANAME:DBAASSCHEMANAME remap_tablespace=BICSTABLESPACENAME:DBAASTABLESPACENAME  directory=datapump dumpfile=YOURDMP.dmp

     

    Similar steps could be used for an on-premise database to either export it using DATAPUMP and upload to the DBaaS database, or to import the DMP file from the BICS Schema Service Database.
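For the on-premise export case, a Data Pump export command along these lines could be used, run on the on-premise host with the schema, directory object, and dump file names replaced as appropriate:

expdp system/SYSTEMPASSWORD SCHEMAS=ONPREMSCHEMANAME directory=datapump dumpfile=YOURDMP.dmp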

     

    Summary

    This article demonstrated a method for migrating data from a BICS Schema Service Database to a BICS DBaaS Database.  The same method could be used for migrating data to or from an On-Premise database.

     

    Further Reading

    Copy DMP file from On-Premise to Cloud DBaaS

    http://www.oracle.com/webfolder/technetwork/tutorials/obe/cloud/dbaas/obe_dbaas_migrating_11g_to_11g_via_data_pump_conventional_exp_imp/index.html#section3

    Another Blog that touches on DBaaS 12C and Private / Public SSH Keys and Copying Data

    https://blogs.oracle.com/dataintegration/entry/odi_12c_and_dbaas_in

     


    Implementing OAuth 2 with Oracle Access Manager OAuth Services (Part IV)


    Introduction

    This post is part IV of a series of posts about OAM’s OAuth implementation.

    Other posts can be found here:

    Part I – explains the proposed architecture and how to enable and configure OAM OAuth Services.

    Part II – describes a Business to Business use-case (2-legged flow);

    Part III  – deals with the Customer to Business use-case (3-legged flow), when the client code is running in the application server;

    Part IV – describes the Customer to Business use-case (3-legged flow), when the client application runs on the browser, embedded as Javascript code;

    Part V  – provides the source code and additional information for the use case implementation.

     

This post presents the C2B use case where the Client Application is JavaScript code running on the browser. This is a variation of the C2B use case presented before: because the client application resides in the browser itself, embedded as javascript code, all the API calls are made directly from the browser to the OAuth server, which presents two problems:

     

    • First, the client credentials would be exposed, hardcoded in the javascript code, when making calls with AJAX, just like this:
      $.ajax({ 
          type: "POST",
          url: "https://oxygen.mycompany.com:14101/ms_oauth/oauth2/endpoints/oauthservice/tokens",
          data: {
              "code":code,
              "grant_type":"authorization_code"
          },
          dataType: "text",
          headers: {
              "Content-Type": "application/x-www-form-urlencoded;charset=UTF-8",
              "Authorization": "Basic d2h5c29jdXJpb3VzOnlvdWFza21l"
          },
          success: function(data){ 
              //handle token
          },
          error: function(data){
              //handle error 
          }
      });

Note that the client credentials appear in a Base64 encoded format: d2h5c29jdXJpb3VzOnlvdWFza21l, which can be easily decoded (here, to whysocurious:youaskme) and is visible to anyone accessing this web page.

     

    • Second, the AJAX call most probably would run into Cross-Domain issues. This is very likely to happen because we’re hosting our application in a web server domain different than the domain where the OAuth server is running. Although there are ways to prevent this, like changing the response headers from the OAuth server, sometimes it will just not be possible.

    To avoid this situation, we are going to implement a ‘proxy’ application or ‘relay endpoint’, that will make the actual calls to the OAuth Server, thus preserving the client credentials and bypassing cross-domains restrictions imposed to AJAX calls in the web browser.

    Use Case Flow

    The following picture shows the flow between the different components

[Diagram: C2B use case flow with a JavaScript client]

    1. The webpage that contains the javascript code is loaded in a browser. The embedded javascript code checks if there is an authorization code in the URL parameters.

2. If it does not find the Authorization Code, it redirects to the OAuth Server Authorization URL. If the user has already been authenticated with OAM, the OAuth Server will not ask him to authenticate again; in this case the flow proceeds to step 5.

    //Gets the Authorization Code
    var code = getURLParameter("code");
    //Not Authorized
    if(typeof code == 'undefined') {
        //Redirecting to Authorization URL
        location.href = "https://oxygen.mycompany.com:14101/ms_oauth/oauth2/endpoints/oauthservice/authorize?response_type=code&client_id=browserClient&redirect_uri=https://oxygen.mycompany.com:7502/ClientWebApp/index.html&scope=Customer.Info%20UserProfile.me&state=getauthz";
        ...

    3. OAuth server replies with the OAM Login Page.

4. User logs in with his credentials.

    5. OAuth Server checks if the user needs to give consent for the client application to access his resources.

    6. OAuth Server replies with the consent page.

    7. User gives consent for the client application to access his resources.

8. OAuth Server returns the Authorization Code back to the redirect_uri as a URL parameter.

    9. The javascript code embedded in the webpage extracts the Authorization Code from the URL parameter and makes a call to the Relay Endpoint requesting an Access Token, passing the Authorization Code.

    } else {
        //Gets an Access Token passing Authorization Code using Proxy Application
        $.ajax({ 			
            type: "POST",
            url: "https://oxygen.mycompany.com:7502/ClientWebApp/rest/tokens",
            data: {'code':code},
            dataType: "text",
            success: function(token){
            ...

10. The Relay Endpoint makes a call to the OAuth Server requesting an Access Token, passing the Authorization Code. Since the code is running server-side, there are no cross-domain access controls imposed by browsers.

    11. The OAuth Server checks the Authorization Code, checks if the grant type is enabled for this client and if its credentials are correct before issuing the Access Token.

    12. OAuth Server replies with the Access Token back to the Relay Endpoint

    13. The Relay Endpoint replies with the Access Token back to the Javascript code embedded in the webpage.

14. The Javascript code embedded in the webpage makes a REST call to the Resource Server passing the Access Token.

    //Gets the Customer Information passing the Access Token
    $.ajax({ 
        type: "POST",
        url: "https://oxygen.mycompany.com:7502/ResourceService/rest/customer",
        data: {
            "code":accessToken
        },
        dataType: "text",
        headers: {
            "Content-Type": "application/x-www-form-urlencoded;charset=UTF-8" 
        },
        success: function(data){        
            //Handles Data returned from the server
        },
        error: function(data){
            //Handles error message 
        }
    });

    15. The Resource Server requests a token validation with the OAuth Server

    16. The OAuth Server validates the token and sends the response to the Resource Server

17. The Resource Server sends back application data to the javascript code embedded in the webpage.

    C2B Use Case Implementation

To implement the C2B use case, an HTML page is used as the Client Application.

    The Relay Endpoint is written in an annotated plain Java class using Jersey to expose it as a RESTful webservice.

    The Resource Service is implemented in a plain Java class using the Jersey framework to expose the Customer Service as a RESTful webservice.

    The Resource Server also implements a Servlet Filter, that intercepts all incoming requests to validate the incoming Access Token before giving access to the REST Endpoint.

    The relevant classes/files for this use case are:

    • index.html
    • TokenProxyService.java
    • UserProfileProxyService.java
    • CustomerService.java
    • OAuthTokenValidationFilter.java
    • OAuthTokenValidator.java

    Using the Oracle Business Intelligence Cloud Service (BICS) REST API to Clear Cached Data


    Introduction

     

This article describes how to use the Oracle Business Intelligence Cloud Service (BICS) REST API dbcache method to clear the BI Server cached data. Prior to dbcache being added to the BICS REST API, the cache could be cleared by unlocking the Model (in Modeler) and running “Clear All Cached Data”. For DataSync version 1.2 and higher, the BI Server cache is automatically purged at the end of the data load. For all other data loading methods, the BI Server cache must be cleared once the data load is complete. The manual method to unlock and clear the cache through the Modeler is still available and valid. However, as with most things ETL/BI related, avoiding manual intervention is generally preferred.

    The article has been divided up into three steps:

    Step One:    Run dbcache in Curl

    Step Two:    Call dbcache from Apex Web Services

    Step Three: Trigger BICS REST API from Analysis – Return status to Dashboard

Although the title of this article is very cache specific, the concepts covered can be applied to many other BICS REST API / Apex Web Services / Stored Procedure use cases.

    For example this article could be used to assist with:

1)    Running any other BICS REST API commands in curl (or through other methods).

    2)    Running any other BICS REST or non-BICS REST API commands through Apex Web Services.

    3)    Triggering any other Stored Procedures from BICS.

    4)    Returning values from any other Stored Procedure to an Analysis.

    Each step of the article provides an individual solution. It is not necessary to implement all three together. In some use cases only one step may be required.

The final Dashboard created in this article is displayed below. The BICS consumer clicks the refresh link, which calls a database function (using EVALUATE), which in turn executes a stored procedure. The stored procedure uses apex_web_service.make_rest_request to run the BICS REST API dbcache method that clears the BI Server cached data. The dbcache method returns a status of 200 for successful completion. Any other status indicates an error of some sort. For example, status 403 indicates an authentication error. The status is returned to the database function / stored procedure and displayed on an Analysis presented on a Dashboard.

[Screenshot: the final Dashboard]

    Main Article

    Step One: Run dbcache in Curl

     

    Download curl from here.

    Replace: User Name, Password, Tenant Name, Analytics URL.

    ***  Analytics URL … not Apex URL ***

curl -i -k -u user:password -X DELETE -H "X-ID-TENANT-NAME: oracletrial123" https://bitrial123-oracletrial123.analytics.us.oraclecloud.com/bimodeler/api/v1/dbcache

    First line should return:

    HTTP/1.1 200 OK

    Step Two: Call dbcache from Apex Web Services

     

    It is recommended to review Executing a Stored Procedure from Oracle Business Intelligence Cloud Service (BICS) prior to reading this section.

     

    For a text version of SQL Scripts click here

    All SQL Statements have been run through Apex -> SQL Workshop -> SQL Commands


    1)    Create Stored Procedure.

    Replace: User Name, Password, Tenant Name, Analytics URL.

CREATE OR REPLACE PROCEDURE CLEAR_CACHE(p_status OUT VARCHAR2) IS
l_ws_response_clob CLOB;
l_ws_url VARCHAR2(500) := 'https://bitrial123-oracletrial123.analytics.us.oraclecloud.com/bimodeler/api/v1/dbcache';
BEGIN
apex_web_service.g_request_headers(1).name := 'X-ID-TENANT-NAME';
apex_web_service.g_request_headers(1).Value := 'usoracletrial123';
l_ws_response_clob :=
apex_web_service.make_rest_request
(
p_url => l_ws_url,
p_http_method => 'DELETE',
p_username => 'UserName',
p_password => 'Pwd'
);
p_status := apex_web_service.g_status_code;
dbms_output.put_line('Status:' || dbms_lob.substr(p_status));
END;

    2)    Test Stored Procedure.

    DECLARE
    p_status VARCHAR2(500);
    BEGIN
    CLEAR_CACHE(p_status);
    END;

    3)    Confirm Stored Procedure runs successfully.

    Status:200

    4)    Create Function.

    CREATE OR REPLACE FUNCTION FUNC_CLEAR_CACHE (
    p_input_value VARCHAR2
    ) RETURN VARCHAR2
    IS PRAGMA AUTONOMOUS_TRANSACTION;
    p_status VARCHAR2(500);
    BEGIN
    CLEAR_CACHE(p_status);
    COMMIT;
    RETURN p_status;
    END;

    5)    Test Function.

SELECT FUNC_CLEAR_CACHE('Hello')
    FROM DUAL;


    Step Three: Trigger BICS REST API from Analysis – Return status to Dashboard.

     

    It is recommended to review Executing a Stored Procedure from Oracle Business Intelligence Cloud Service (BICS) prior to reading this section.

1)    In the database, create a dummy table that will be used to reference the function. The table should have one column and one row. Populate the table with descriptive text.

    CREATE TABLE DUMMY_REFRESH
    (REFRESH_TEXT VARCHAR2(255));

    INSERT INTO DUMMY_REFRESH (REFRESH_TEXT)
VALUES ('Dummy Table Used to Clear Cache');

    2)    Go to Modeler -> Lock to Edit the Data Model

    3)    Add the dummy table as a Dimension table.

    4)    Join the dummy table to a Fact Table. This does not have to be a true join. However, the data types must match.

    5)    Click on the dummy table in the Dimension Table list

    6)    Click on Add Column

    7)    In the Expression Builder type:

EVALUATE('FUNC_CLEAR_CACHE(%1)','Hello')


    8)    In Name and Description type: Run_Func

    9)    Validate the Expression

    10)  Click -> Done -> Done -> Publish the Model

    11)  Create a new Analysis

    12)  It may be necessary to Refresh -> Reload Server Metadata in order to see the new expression created in the previous step


    13)  Add both columns


14)  Edit the Run_Func column to contain the following CASE statement, and rename the column to “Cache Status”.

CASE WHEN "DUMMY_REFRESH"."Run_Func" = '200' THEN 'Successful' ELSE 'Failed with Status: ' || "DUMMY_REFRESH"."Run_Func" END


    15)  Hide the descriptive text field.

    16)  Test the Analysis runs successfully.


     

    17)  Add the Analysis to a Dashboard

    18)  Set custom Report Links


     

     

    19)  Customize -> Refresh


    20)  Final Dashboard


     

    21)  Experiment by changing the username or password in the Stored Procedure to be incorrect. Confirm it fails and the status is returned to the Dashboard.


    Further Reading

     

    Click here for related A-Team BICS blogs

    Click here for the Application Express API Reference Guide –  MAKE_REST_REQUEST Function

    Click here for the REST API Reference for Oracle Business Intelligence Cloud Service

Click here for more information on the EVALUATE function. This link is to the “Logical SQL Reference” section of the “Oracle® Fusion Middleware Metadata Repository Builder's Guide for Oracle Business Intelligence Enterprise Edition”; therefore, not all commands in that guide are applicable to BICS. The relevant section on the EVALUATE function is provided below. For BICS environments EVALUATE_SUPPORT_LEVEL is enabled by default.

[Screenshot: EVALUATE section of the Logical SQL Reference]

    Summary

    This article described how to use the Oracle Business Intelligence Cloud Service (BICS) REST API dbcache method to clear the BI Server cached data.

Upon completion of the article the reader will have learned how to:

    1)    Use the BICS REST API dbcache method to clear the BI Server Cache.

    2)    Call the BICS REST API through the Apex API using the MAKE_REST_REQUEST function.

    3)    Use EVALUATE in the Data Modeler to execute a database function.

    4)    Trigger a database function from a BICS Analysis – displaying return values on a Dashboard.

    Documents Cloud and Atlassian Confluence: A Macro for Attachments


Oracle Documents Cloud Service and Atlassian Confluence both use the term “Application Link” but with different meanings. The Confluence definition is related to connecting two different systems together for integration, whereby authentication settings and connection details are kept securely. In Documents Cloud, the Application Link REST resource definition refers to an embedded user interface that can be added into other applications. To make the terminology a bit more confusing, both products use the short name “AppLinks”. But these two features work well together for integrating Documents Cloud into Confluence as an attachment storage feature. Using a Documents Cloud macro for attachments provides an HTML5 user experience with drag and drop capabilities for adding and managing files. The standard Confluence attachments macro allows content to be stored within Confluence, but putting content in the Oracle cloud allows for further possibilities with those files.

    Creating a Confluence macro skeleton can be done by following tutorials that illustrate the steps. The Atlassian SDK is required in the development environment. This macro was tested on Confluence versions 5.4.x and higher. Once the project is created, a Documents Cloud macro that uses the embedded user interface can be written into a single Java class. The class must extend the Confluence Macro interface. The macro can be created in a single class and packaged into a jar file that can be uploaded into Confluence. A similar macro could be created for Jira, although that is not explored in this post.

    Adding the macro to the page is like any other installed macro in Confluence. The macro can be selected from the list of installed macros and dropped into the Confluence page.


     

    The result of the DOCSMacro is shown in the following screen shot. A Confluence page where the macro has been added renders the folder in embedded format, and users cannot navigate above the folder and see other pages’ folder content. An embedded and locked-down view of a single folder in Documents Cloud is presented. Users can interact with the folder using the intuitive interface of Documents Cloud Service. In this sample, all of the Folders are created at root level of Documents Cloud, which is coined the “self” folder in Documents Cloud. The assumption for the sample code is that the Documents Cloud user account is only being used for Confluence. In other words, the account is dedicated for this purpose.

The function of this “DOCSMacro” follows a pattern that is re-usable in other Documents Cloud “Application Link” integrations. The macro makes only two REST calls to Documents Cloud. If the macro is loading on a Confluence page for the first time, a create folder REST call is made. The folder that is created is then saved in a page property (docs_folder_guid), so that the folder is persisted and forever associated with that page. Because of this, the macro has a 1:1 relationship between Confluence page and Documents Cloud folder. Even if the macro on the page is removed and re-added, the same folder that was originally created will be loaded again because the Confluence page property will remain.

    Once a folder is created and saved to the page, the macro will call the Documents Cloud AppLink REST service. An embedded iframe is generated with the REST response details and access tokens. JavaScript handling of the “appLinkReady” event is also included in the macro.

     


     

    Installing the Macro

Log in to Confluence as an Administrator. Go to the “Managed Add-Ons” page and click “Upload add-on”. Select the jar file (e.g. docsmacro-1.0-snapshot.jar) and install it.


    The macro installation process will then show a success page.


     

     

     

    Setup the Confluence Application Link

The Confluence Application Link setup is done to store the Documents Cloud host, username, and password for making connections from the macro. This feature keeps the password secure within Confluence, and the macro is able to access the connection details for making REST calls using the Atlassian classes ApplicationLinkRequest and ApplicationLinkRequestFactory. But first the Application Link must be created in Confluence.

Log in to Confluence as an Administrator. On the Administration menu, create a new “Application Link”.  Enter the URL of the Oracle Documents Cloud instance in the format below. No trailing slash is needed, just the protocol, host, and context root.

    https://myhost/documents

    Then click “Create New Link”. On the next page that displays, check the box for “Use this URL”. The URL has been redacted from the screenshot but is the same format as listed above. Click Continue.


     

     

The name of the Confluence Application Link must be the value “Oracle Documents Cloud”. This value is hardcoded into the sample macro, and when the macro executes that name will be looked up. The application type must be “Generic Application”.


     

     

    Once the Application Link is created in Confluence, the security credentials must be set. Edit the Application Link. The first page that loads will show information that you have already entered. Click on the “Outgoing Authentication” option on the left menu to enter in the username and password of the DOCS account. Then click “Enabled” to activate the link.


     

    Technical details of the project

A key dependency must be added to the pom.xml in order for the Confluence Application Link classes to be available in the macro. The applinks-api must be included, as in this example.

    <dependency>
    	<groupId>com.atlassian.applinks</groupId>
    	<artifactId>applinks-api</artifactId>
    	<version>4.2.5</version>
    </dependency>

     

    Likewise, in the atlassian-plugin.xml file, references to the applicationLinkService must be present as a component-import entry.

     

    <component-import key="applicationLinkService" 
        interface="com.atlassian.applinks.api.ApplicationLinkService" />
    <component-import key="entityLinkService" 
        interface="com.atlassian.applinks.api.EntityLinkService" />
    
    

     

    In the DocsMacro java code, having the applinks-api makes light work of calling REST services to Documents Cloud. Once the REST URL is prepared, the request that gets created needs no authentication header set because Confluence takes care of it for you. The credentials are stored in the Application Link, thus no need for coding to get the credentials into the Macro itself.

    ApplicationLinkRequestFactory requestFactory = confluenceAppLinkForDocs
    				.createAuthenticatedRequestFactory();
    
    ApplicationLinkRequest request = requestFactory.createRequest(MethodType.POST, docsUrl);
    
    String responseBody = null;
    try {
    	responseBody = request.execute(new ApplicationLinkResponseHandler<String>() {
    		public String credentialsRequired(
    				final Response response)
    				throws ResponseException {
    			return response.getResponseBodyAsString();
    		}
    
    		public String handle(final Response response)
    				throws ResponseException {
    			return response.getResponseBodyAsString();
    		}
    	});
    } catch (ResponseException e) {

    Once a JSON response is returned from Documents Cloud, parsing the JSON and extracting the folder id can be done with a JsonParser. The “id” field in the JSON response contains the folder GUID, and setting a page property to the id can be performed using the Confluence ContentPropertyManager class.

    JsonObject jobj = new JsonParser().parse(responseBody).getAsJsonObject();
    contentPropertyManager.setStringProperty(conversionContext.getEntity(), 
                                             guidPageProperty, jobj.get("id").getAsString());
    
    

     

    On the DOCS AppLink request and response, a bit of additional work is needed because the appLinkReady event must be handled. The JSON response contains the URL and the proper tokens needed for handling the event. The output of the macro is ultimately an HTML page with an iframe and event handler.

     

    JsonObject jobj = new JsonParser().parse(responseBody)
    		.getAsJsonObject();
    
    // Create the iframe
    builder.append("<iframe id='content_frame' src="
    		+ jobj.get("appLinkUrl")
    		+ " style='width:100%; height:520px;''></iframe>\n");
    
    builder.append("<script>\n");
    builder.append("$( document ).ready(function() {\n");
    
    builder.append("var dAppLinkUrl=" + jobj.get("appLinkUrl") + ";\n");
    builder.append("var dAppLinkRefreshToken=" + jobj.get("refreshToken")
    		+ ";\n");
    builder.append("var dAppLinkAccessToken=" + jobj.get("accessToken")
    		+ ";\n");
    builder.append("var dAppLinkRoleName='" + parameters.get("role")
    		+ "';\n");
    builder.append("var embedPreview='true';\n");
    
    builder.append("	function OnMessage (evt) {\n");
    builder.append("		console.log('in onMessage function, message is:' + evt.data.message);\n");
    builder.append("		if (evt.data.message === 'appLinkReady') {\n");
    builder.append("			var iframe= $('#content_frame')[0];\n");
    builder.append("			var iframewindow= iframe.contentWindow ? iframe.contentWindow : iframe.contentDocument.defaultView;\n");
    
    builder.append("				var msg = {\n");
    builder.append("					message: 'setAppLinkTokens',\n");
    builder.append("					appLinkRefreshToken:dAppLinkRefreshToken,\n");
    builder.append("					appLinkAccessToken:dAppLinkAccessToken,\n");
    builder.append("					appLinkRoleName:dAppLinkRoleName,\n");
    builder.append("					embedPreview: embedPreview\n");
    builder.append("				}\n");
    builder.append("					console.log('calling iframewindow.postmessage');\n");
    builder.append("				iframewindow.postMessage(msg, '*');\n");
    builder.append("		}\n");
    builder.append("	};\n");
    
    builder.append("	console.log('calling window.addEventListener for message callback');\n");
    builder.append("	window.addEventListener && window.addEventListener('message', OnMessage, false);\n");
    
    builder.append("});\n");
    builder.append("</script>\n");
    
    return builder.toString();

     

    To download the Macro and test it out, the packaged jar is available in this zip file. Likewise, the two important pieces of the Macro itself are included, which are the DocsMacro.java class that does the work, and the atlassian-plugin.xml file that enables the Confluence AppLink API. This sample could be extended in various ways using the REST API to build a custom interface instead of the DOCS AppLink feature. The key part is gaining access to the DOCS REST API from within Confluence, and then all of the functionality of Oracle Documents Cloud, and potentially other Oracle PaaS products, is available to you.

    ConfluenceDocsMacro.zip

     

    Troubleshooting Techniques for a JCS-SX ADF Application Embedded in SalesCloud


    Introduction

    I have been playing around with embedding an external ADF application (running on Oracle Java Cloud Service – SaaS Extension) within Oracle SalesCloud to demonstrate the trusted association (SSO) between SalesCloud and JCS-SX and ran into a couple of issues. I would like to share my experience of troubleshooting those issues and provide a few tips for fellow developer comrades that may experience similar problems.

    Technology Versions

    SalesCloud 11.1.9.2+, 11.1.10+
    JCS-SX 15.2+
    ADF 11.1.1.7.1

    Main Article

    Before I get into discussing techniques used for troubleshooting, let me provide a brief overview of what I am trying to do in the first place. I have created a very simple ADF application called AccountDetails that will be deployed on JCS-SX and accesses data from the co-located DbCS Schema instance (which will be available as a Sample soon). My goal is to make the ADF application, which is just a single page for now, appear as a part of the SalesCloud content. The ADF content will be displayed within a tab on the Account Details screen displaying additional information stored in DbCS related to the given account.

    Note: This code will be uploaded as a Sample Application soon, and I will post the link here once it is live. At this link you will also find a step-by-step readme to run this sample along with necessary documentation if you want to build your own version of this sample.

    In this article, I’ll focus on the issues that I hit during the development and deployment of that sample – which may be useful to troubleshoot issues when you are building an implementation similar to this sample.
    Below is where I hit problems while developing this sample. I had installed the ADF application in JCS-SX, performed the customizations in SalesCloud’s AppComposer to embed the ADF application in the sub-tab, and initially the content did not appear. The following is a summary of the issues I faced while getting the content to load properly. Following the summary will be details about each issue, including the technique I used to diagnose it, the cause, and how I fixed or worked around the issue (but keep in mind the solutions are not the ultimate focus of this article).
    At this point, I am able to open the ADF application (via a direct link to the application hosted on JCS-SX) in a full browser tab and am able to pass in an Account Number via the URL. This is what the application looks like when loaded directly into a browser tab:
[Screenshot: AccountDetails application loaded directly in a browser tab]


    Issues Summary

Issue 1: Although the ADF page renders in the browser, nothing displays when viewing this page within the SalesCloud sub-tab. Diagnostic technique: View Frame Source.
Issue 2: After addressing the first issue, still nothing is displaying within the tab. Diagnostic technique: Browser’s built-in Developer Tools.
Issue 3: After addressing the second issue, a 500 error is displaying within the tab. Diagnostic technique: SalesCloud and JCS-SX logging.

    Issues Details

    Issue 1: Nothing is displaying within the SalesCloud sub-tab

    Description

    Although I can access the ADF application if I open it in a full screen (i.e. place the URL directly in its own browser tab), it doesn’t display correctly when embedded in the SalesCloud Account Details tab. All I see is:
[Screenshot: blank sub-tab content area]

    Diagnostic Technique

    By right-clicking the tab content area and selecting View Frame Source, Chrome will open another tab with “view-source:” prepended to the URL and this may give us some insight as to what the problem might be.

    Cause

    In this case, rather than seeing the page’s source, I get a screen telling me that Chrome has blocked access to unsafe content. This happens because the development server I’m working with is using a self-signed certificate and the browser expects it to be signed by a proper Certificate Authority.

    Solution

    Ultimately, to appease the browser, we’d need to get the certificate signed by a CA, but since this is only a development machine and we don’t want to bother with the expense of doing so, the workaround is to instruct the browser to proceed regardless.

    Note: You may not require this if you are developing using Oracle Public Cloud JCS-SX instances. If you are using your development server in conjunction with Oracle Sales Cloud, you may face this issue.

    Issue 2: After addressing the first issue, still nothing is displaying within tab.

    Description

Although I bypassed the certificate issue, I’m still not seeing any content in the tab. This time View Frame Source is not helpful, because all it has is the “about:blank” page, which is the browser’s placeholder for empty content.

    Diagnostic Technique

    By viewing the JavaScript Console available in Chrome’s Developer Tools, we may be able to see what the problem is. Show the Console by clicking View > Developer > JavaScript Console.

    Cause

The console shows that the requested content refused to be displayed within a frame. Content can instruct the browser not to render it within a frame by setting its “X-Frame-Options” header to “DENY”, which is what happened here. This URL is actually the SSO login page. The ADF application I created is protected, so it is only accessible by someone who has been authenticated with the Authentication Provider for JCS-SX. Because we are connecting to a JCS-SX instance that is associated with this SalesCloud instance, we can use the SSO sign-in to authenticate with either SalesCloud or JCS-SX and either partner will accept the authentication token. At this point, I have only logged into SalesCloud directly and not into the SSO provided by OAM.

    Solution

Access the JCS-SX console (which will prompt with the SSO login) before attempting to access the sub-tab in SalesCloud. An alternative solution is to set the “org.apache.myfaces.trinidad.security.FRAME_BUSTING” parameter to “never” in the web.xml of your ADF application, but be aware that this disables framebusting and can leave your application vulnerable to Clickjacking (a.k.a. UI redress).
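If that alternative is chosen, the setting is a standard Trinidad context parameter in web.xml, along these lines:

<context-param>
  <param-name>org.apache.myfaces.trinidad.security.FRAME_BUSTING</param-name>
  <param-value>never</param-value>
</context-param>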

    Issue 3: After addressing the second issue, a 500 error is displaying within the tab.

    Description

I am getting closer. At least the 500 is a response from the JCS-SX server, but obviously this is not the correct content.

    Diagnostic Technique

    Using a combination of view source, Run Time Messages in SalesCloud’s ApplicationComposer, and the log messages in the JCS-SX console, I can get a better idea of what’s going on.

    Cause

    Using View Frame Source, I see that the browser is trying to load what looks like the correct URL (I have removed some of the ADF params that clutter it, e.g. jsessionid, _afr*, for clarity):
    https://jcssx-host/AccountDetails/faces/adf.task-flow?adf.tfDoc=/WEB-INF/main/tf-main.xml&amp;adf.tfId=tf-main&amp;accountNumber=300000000845217
    This doesn’t tell me a whole lot, unless I know exactly what it should be, but if I check the log messages using the JCS-SX console, I can see there are some errors having to do with the ADF task-flow controller being unable to parse a task-flow Id from the URL.
    Upon further inspection of the URL revealed by View Frame Source, I notice the query parameters are not quite correct. Notice the “&” before “adf.tfId” is followed by “amp;”. Same for “accountNumber.” Apparently, something is encoding the ampersands in the URL, and doing so incorrectly. If an “&” is going to be encoded in a URL, it should be “%26”, not “&amp;”. In this case, “amp;” is becoming part of the “adf.tfId” parameter name, so the query param “adf.tfId” is not present, because it is now actually “amp;adf.tfId”.
    To help me understand where this is occurring, I employ some logging in the Groovy script that is generating the URL for the SalesCloud Web Content:
[Screenshot: logging statements added to the Groovy script]
     I can then check the Run Time Messages (SalesCloud > Navigator > More … > Application Composer > Run Time Messages) to see how the URL is coming out of the Groovy script (*note: be sure to enable Application Script Logging before accessing the Account Details embedded content tab so the messages will appear in the message log), and in this case it is correct, there are no extra “amp;” appended to the ampersands:
[Screenshot: Run Time Messages showing the correctly formed URL]
    The URL from the Groovy script looks good, so let’s try adding some logging in the ADF application to see if that will help us determine where the URL is getting mis-encoded. Since ADF is a J2EE compliant architecture, I can add a filter to the web.xml that logs the request URLs (there are other options for logging the URL, but this is what I was able to come up with quickly). I created the following Filter:
    package oracle.cloud.sampleapps.accountdetails.view;
     
    import java.io.IOException;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletRequest;
    import oracle.adf.share.logging.ADFLogger;
     
    public class AccountDetailsFilter implements Filter {
     
        private static ADFLogger LOG = ADFLogger.createADFLogger(AccountDetailsFilter.class.toString());
        
        public AccountDetailsFilter() {
            super();
        }
        public void destroy() {
        }
        public void init(FilterConfig filterConfig) { }
        public void doFilter(ServletRequest servletRequest,
                             ServletResponse servletResponse,
                             FilterChain filterChain) throws IOException, ServletException {
            HttpServletRequest request = (HttpServletRequest)servletRequest;
            final String queryString = request.getQueryString();
            LOG.severe("[AccountDetailsFilter] Incoming URL: "+request.getRequestURI() +"?" + queryString);
            filterChain.doFilter(servletRequest, servletResponse);
        }
    }
     And in the web.xml, I added the following filter and filter-mapping entries:
      <filter>
        <filter-name>AccountDetailsPreFilter</filter-name>
        <filter-class>oracle.cloud.sampleapps.accountdetails.view.AccountDetailsFilter</filter-class>
      </filter>
      <filter-mapping>
        <filter-name>AccountDetailsPreFilter</filter-name>
        <url-pattern>/*</url-pattern>
        <dispatcher>REQUEST</dispatcher>
      </filter-mapping>
    After deploying the ADF application to an EAR and redeploying it to JCS-SX, I can now see logging statements from the AccountDetailsPreFilter that I created. They show that the URL is coming in with the ampersands in an encoded format.
Unfortunately, I think this means the issue is somewhere between the Groovy script and the ADF application, which means it is likely somewhere in SalesCloud’s handling of the URL returned by Groovy for display in the tab’s WebContent, especially since I can close the AccountDetails dialog, reopen it, access the tab, and the content will display correctly. And the logs show the URL coming into ADF properly at that point. So, the issue only occurs the very first time I access the tab. I will need to check with the SalesCloud team to see about digging into this further, but that I will leave for another article. In the meantime, I’ll just code a quick workaround to keep me moving forward.

    Solution

    If any aberrant ampersands exist in the requested URL, correct them and redirect the response to the correct URL. Here’s the code update for the AccountDetailsPreFilter (*note: I also reduced the pattern scope on the filter in the web.xml to “/faces/*” instead of “/*” ):
    package oracle.cloud.sampleapps.accountdetails.view;
     
    import java.io.IOException;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import oracle.adf.share.logging.ADFLogger;
     
    public class AccountDetailsFilter implements Filter {
     
        private static ADFLogger LOG = ADFLogger.createADFLogger(AccountDetailsFilter.class.toString());
     
        private static final String ENCODED_AMP = "&amp;";
        private static final String PLAIN_AMP = "&";
     
        public AccountDetailsFilter() {
            super();
        }
     
        public void destroy() {
        }
     
        public void init(FilterConfig filterConfig) {
        }
     
        public void doFilter(ServletRequest servletRequest,
                             ServletResponse servletResponse,
                             FilterChain filterChain) throws IOException,
                                                             ServletException {
            HttpServletRequest request = (HttpServletRequest)servletRequest;
            HttpServletResponse response = (HttpServletResponse)servletResponse;
     
            final String queryString = request.getQueryString();
            LOG.fine("[AccountDetailsFilter] Original URL: "+request.getRequestURI() +"?" + queryString);
     
            //A query String should never contain an ampersand encoded as "&amp;", the
            // proper URL encoding would be "%26" if one wanted to include "&" as part
            // of parameter value. So, we should be able to safely replace all occurrences
            // of "&amp;" with just "&".
            if(queryString != null && queryString.contains(ENCODED_AMP)){
                LOG.severe("[AccountDetailsFilter] URL Contains poorly encoded Ampersands: "+request.getRequestURI() +"?" + queryString);
                final String newUrl = request.getRequestURI() +"?" + queryString.replaceAll(ENCODED_AMP, PLAIN_AMP);
                response.sendRedirect(newUrl);
                LOG.severe("[AccountDetailsFilter] Redirecting to: "+newUrl);
            }else{
                filterChain.doFilter(servletRequest, servletResponse);
            }
        }
    }

    Summary

    In this article, I covered several issues that I faced while trying to embed into SalesCloud an ADF application that is hosted on JCS-SX. The symptoms of all the issues were basically that the ADF content was not displaying in the tab even though the ADF application was working properly when directly accessed in a browser tab. I demonstrated how to use browser features, such as View Frame Source and JavaScript Console, how to add logging to Groovy script used to customize SalesCloud and view those messages in the Run Time Messages section of Application Composer, as well as how to add logging to a filter in the ADF application and view those messages via the JCS-SX console. There are many more techniques that can be used to troubleshoot issues in SalesCloud and JCS-SX, but these should be a good start to help make you successful in customizing SalesCloud.
     

    p.s. Other common issues faced while deploying the ADF application to Sales Cloud are solved by:
    - Using the correct version of JDeveloper (11.1.1.7.1 – Cloud Version).
    - Using the correct version of JSF by setting the deployment profile target to type “Oracle Cloud.”

    Tips on Documents Cloud URL Parameters


    When beginning to use or integrate Documents Cloud into your business, getting to know the URLs in the web interface is helpful. Some of this information is only useful for integrators and developers but even the casual user may find it useful to know how to “read” the URL when viewing a folder or a file.

    GUID Prefixes

The most rudimentary bit of reading the URL is understanding the folder and file ID values. When a folder is clicked in the web interface, the structure of the URL is /documents/folder followed by a GUID that represents the folder. Notice that the GUID starts with the letter F.

    https://mydocshost/documents/folder/F0AF7C0FFA4D23F7509C524715BE5DEA38AC513C7430/_Demo/nameasc

When a file is opened in the viewer, the URL has /documents/fileview followed by the GUID. The GUID starts with the letter D, which indicates that this is a document, not a folder.

    https://mydocshost/documents/fileview/D185DA6168097A5DE4A779AD15BE5DEA38AC513C7430/_Capture.JPG

One other ID type in Documents Cloud worth mentioning is the public link, which starts with the letter L.

    https://mydocshost/documents/link/LF8D215F4ACE184633B33EBB15BE5DEA38AC513C7430/folder/F0AF7C0FFA4D23F7509C524715BE5DEA38AC513C7430/_Demo

    This may not seem an important distinction, until perhaps you are writing an integration and staring at a GUID with no URL surrounding it at all. Knowing this basic fact allows quick discernment between folder and file ID, and this can be helpful when using the REST API or if someone sends you an ID with no additional information.
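As an illustration, an integration that stores bare IDs could classify them with a one-character check. Here is a sketch as a PL/SQL function (the function name is hypothetical; the same check is trivial in any language):

CREATE OR REPLACE FUNCTION DOCS_ID_TYPE (p_id VARCHAR2)
RETURN VARCHAR2 IS
BEGIN
-- Classify a Documents Cloud GUID by its leading letter
RETURN CASE SUBSTR(p_id, 1, 1)
WHEN 'F' THEN 'folder'
WHEN 'D' THEN 'document'
WHEN 'L' THEN 'public link'
ELSE 'unknown'
END;
END;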

    Grid and List Views


     

     

    Documents Cloud has two view options, grid and list view. These can be toggled in the user interface with the “lyt” parameter, short for “layout”. This can be set to grid or list. For each Documents Cloud user, the layout preference is saved when he or she clicks the layout icon on the screen, as it should be since the user will likely arrive at a preferred viewing mode. But for an integration, forcing a layout parameter may be important for a consistent user experience. If embedding a Documents Cloud folder into another application, setting the parameter to grid or list every time will avoid confusion for users. This setting can be added to the end of any folder URL in the following format, where lyt=grid or lyt=list.

    https://mydocshost/documents/folder/F0AF7C0FFA4D23F7509C524715BE5DEA38AC513C7430/_Demo/nameasc/lyt=grid

     

    Sort Order


    Three options for sorting are available, and can be set in the URL so that on page load the selection is already made. These are:

    updated: Sort by last updated file, most recent on top.
    nameasc: Sort by name ascending.
    namedsc: Sort by name descending.

When you log in to your Documents Cloud account this can be seen in every folder URL. The default parameter is set on the home folder and all other folder views.

    https://mydocshost/documents/home/nameasc

    Worth noting is that if you choose a different sort option in the web interface it is a sticky setting and travels with your navigation to other folders.

     

    Filters

     

The default view of a Documents Cloud account is to use no filter, meaning that all folders and files, shared or unshared, are visible. However, showing only shared items is possible, as is showing only those folders and files created by you. Two options exist as URL parameters for the filter:

    shared: Shows only those folders created by someone else who has added you as a member. These are the blue folder icons that you see in the default view.

    owned: Any folder or file you have created. For folders, these are the yellow folder icons, including the folders where you have added other Documents Cloud users as members.

    https://documents.us.oracle.com/documents/home/owned/nameasc

    A message in the upper left also mentions the filter when set. See the “Owned by You” text on the page when the “owned” parameter is set. Folders that have a yellow shared icon are folders that you have created and shared with others. When using the “shared” parameter, the message “Shared with You” appears and the folder icons are blue to indicate someone else in your DOCS instance has shared the folder with you.

    owned

     

     

     

    Favorites

    faves

     

     

To get to your favorites, a URL parameter can be set to load only those folders and files you have starred. The “favorites” URL parameter can be added in the following manner.

    https://mydocshost/documents/favorites/nameasc

     

    faves2

    Viewing a Specific Revision

The file viewing URL can be constructed with a document ID. The file name is not necessary to load a file in the viewer; just the document ID is needed.

    https://mydocshost/documents/fileview/D148C1911C8639EB0AB89E5815BE5DEA38AC513C7430

    When the page loads in the viewer, the filename is appended to the URL, as shown in the link below.

    https://docs-ateamcore.documents.us2.oraclecloud.com/documents/fileview/D148C1911C8639EB0AB89E5815BE5DEA38AC513C7430/_reviseddoc3.doc

    If a specific revision is needed, the revision number can be specified at the end of the URL. For example, if the second revision of a file is the target, add a “2” to the end of the fileview URL.

    https://mydocshost/documents/fileview/D148C1911C8639EB0AB89E5815BE5DEA38AC513C7430/2
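Programmatically, the same viewer URL can be assembled from just the document ID and an optional revision number, as in this small sketch (host name is a placeholder):

// Sketch: build a viewer URL from a document GUID, optionally pinned
// to a specific revision.
function buildFileViewUrl(host, docGuid, revision) {
  var url = 'https://' + host + '/documents/fileview/' + docGuid;
  return revision ? url + '/' + revision : url;
}

console.log(buildFileViewUrl('mydocshost', 'D148C1911C8639EB0AB89E5815BE5DEA38AC513C7430', 2));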

     

    fileview-specific-rev

     

     

     

     

    Consuming RESTful Web Services in Oracle Database Cloud Service using PL/SQL


    Introduction

Oracle Database Cloud Service with the Database as a Service option comes preconfigured with Oracle APEX and Oracle REST Data Services. These features can be used to easily create and interact with RESTful Web Services. This example is based on a specific customer requirement; however, it is useful across a range of RESTful Web Service scenarios. The customer needed to access RESTful Web Services using PL/SQL, mainly to integrate the RESTful Web Services into their existing PL/SQL applications and to use them across a range of other Java applications without creating a separate interface. This example builds on the RESTful services that were created as part of this post.

     

    Web Service Creation

    Follow the steps outlined in this post. For this example you will need to create the GET and POST Resource handlers.

     

    Network Configuration

In order to access the RESTful Web Service, traffic to port 80 has to be enabled for the Database Cloud Service instance. For this example, traffic will only be allowed from a specific IP address. To achieve this, navigate to the Oracle Compute Cloud Service Console, click the Network tab, and click the Security IP Lists button on the left-hand side.

    image1n

Clicking the “Create Security IP List” button opens a dialog where a meaningful name and description, plus the IP address that will access the service, need to be entered. Finish by clicking “Create”.

    image2

The next step is to create a Security Rule. Select the “Security Rules” button on the left-hand side and click the “Create Security Rule” button on the right.

    image3

This opens a dialog where a name needs to be entered. Make sure to select “Enabled” for the status, select the previously created Security IP List as the source, and select your DBCS instance as the destination. For this example, select the out-of-the-box Security Application http – this will open up port 80. Finish by clicking “Create”.

    image4

For easier access to the database, I also created a rule to allow traffic on port 1521, using the Security Application ora_dblistener (port 1521). This allows direct access to the database from the whitelisted IP address.

    image5

    Retrieving Data: GET via PL/SQL

Accessing the RESTful Web Service is straightforward. Make sure to replace the <dbcs-ip> tag with the public IP of your Database Cloud Service instance. This public IP can be found on the Overview page of the Oracle Database Cloud Service Console.

    image6n

The central function used for interaction with RESTful Web Services is apex_web_service.make_rest_request. Note that there is also a function called apex_web_service.make_request, which is similarly used for interaction with SOAP Web Services. Be aware that the return type is a CLOB in JavaScript Object Notation (JSON) and might need to be parsed for your particular use case. A great tutorial on how to achieve this can be found here.

set SERVEROUTPUT ON

declare
  v_result clob;
begin
  v_result := apex_web_service.make_rest_request(
    p_url         => 'http://<dbcs-ip>/ords/pdb1/restful/person/persons/'
   ,p_http_method => 'GET'
    );
  dbms_output.put_line(v_result);
end;
/

This will return the data that was previously created in the example here. Using SQL*Plus, the returned JSON is printed via the dbms_output.put_line procedure.

    pic3

    Submitting Data: POST via PL/SQL

apex_web_service.make_rest_request also allows submitting data to a RESTful Web Service using the POST method. The following script uses the POST Resource Handler to submit data into the instance. apex_web_service.g_request_headers allows submitting headers as part of the request.

set DEFINE off
set SERVEROUTPUT ON

declare
  v_result clob;
  v_lastname varchar2(20);
  v_firstname varchar2(20);
  v_email varchar2(40);
  v_phone_number varchar2(20);
  v_body clob;
begin
  v_lastname := 'Koenn';
  v_firstname := 'Roland';
  v_email := 'mail@oracle.com';
  v_phone_number := '1112223333';
  v_body := to_clob('LASTNAME='||v_lastname||'&FIRSTNAME='||v_firstname||'&EMAIL='||v_email||'&PHONE_NUMBER='||v_phone_number);

  apex_web_service.g_request_headers(1).name := 'Content-Type';
  apex_web_service.g_request_headers(1).value := 'application/x-www-form-urlencoded';

  v_result := apex_web_service.make_rest_request(
    p_url         => 'http://<dbcs-ip>/ords/pdb1/restful/person/persons/'
   ,p_http_method => 'POST'
   ,p_body        => v_body
   );

  dbms_output.put_line(v_result);
end;
/

After submitting the above call, the data can be verified by using the GET method – same as previously:

    pic1

Alternatively, we can query the target table directly – for example, using the APEX GUI.

    pic2

    Further Reading

    Application Express API Reference – APEX_WEB_SERVICE
    https://docs.oracle.com/database/121/AEAPI/apex_web_service.htm 

    Oracle REST Data Services Documentation
    https://docs.oracle.com/cd/E56351_01/index.htm

    Implementing OAuth 2 with Oracle Access Manager OAuth Services (Part V)


    Introduction

    This post is part of a series of posts about OAM’s OAuth implementation.

    Other posts can be found here:

Part I – explains the proposed architecture and how to enable and configure OAM OAuth Services;

Part II – describes a Business to Business use-case (2-legged flow);

Part III – deals with the Customer to Business use-case (3-legged flow), when the client code is running in the application server;

Part IV – describes the Customer to Business use-case (3-legged flow), when the client application runs on the browser, embedded as Javascript code;

Part V – provides the source code and additional information for the use case implementation.

    The previous posts explained how to configure OAM OAuth Server and how the different use-cases work.

This last post discusses Access Token and Refresh Token usage, along with the token validation strategy on the Resource Server side.

    Both topics are directly related to the client application design, security and overall system performance.

And last, but not least, source code is provided that can be used as a starting point to understand the basics of OAM’s OAuth implementation.

    Access Token and Refresh Token Usage

In the examples provided in this post series, there is no Access Token reuse or Refresh Token usage.

    In a real application scenario, once the client application obtains the Access Token, it would probably make several calls using the same token, as long as it remains valid.

    If the Access Token has expired –  and the client application should be able to check its validity before making a call – the client application can request another Access Token using the Refresh Token and its client credentials.

This way, the Client Application avoids making an additional call to the OAuth server every time it needs to call a service in the Resource Server.

In real-world scenarios this matters: under real load it is simply not practical to request a new token each time a call to the Resource Service is made, and doing so would seriously hurt system performance.

    There is also a security reason behind Refresh Tokens.

Because Access Tokens are short-lived (in contrast to long-lived Refresh Tokens), if they are compromised or the end user wishes to revoke the client application's access, this can be done by revoking the Client Application tokens (Access and Refresh) and denying its access to all user resources.

This way, even if the application still has a valid Access Token or Refresh Token cached, it can no longer be used to request access to resources or to request additional Access Tokens.

Remember, Refresh Tokens are obtained by exchanging the client credentials with the Authorization Server; thus, by revoking the Client Application access (and all its tokens), the user can deny access to his resources at any time, without compromising security or other Client Applications' access.
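To make the mechanics concrete, the following is a minimal Node.js sketch of the refresh exchange: the client posts the Refresh Token along with its own credentials and receives a fresh Access Token. The token endpoint path and parameter names below are assumptions based on a typical OAM OAuth Services setup – verify them against your own instance.

// Sketch (Node.js): exchange a Refresh Token for a new Access Token.
// Endpoint path and parameter names are assumptions - check your OAM
// OAuth Services configuration.
var https = require('https');
var querystring = require('querystring');

function refreshAccessToken(host, clientId, clientSecret, refreshToken, callback) {
  var body = querystring.stringify({
    grant_type: 'refresh_token',
    refresh_token: refreshToken
  });
  var req = https.request({
    host: host,
    path: '/ms_oauth/oauth2/endpoints/oauthservice/tokens', // assumed default path
    method: 'POST',
    headers: {
      'Content-Type': 'application/x-www-form-urlencoded',
      'Authorization': 'Basic ' + new Buffer(clientId + ':' + clientSecret).toString('base64'),
      'Content-Length': Buffer.byteLength(body)
    }
  }, function(res) {
    var data = '';
    res.on('data', function(chunk) { data += chunk; });
    res.on('end', function() { callback(null, JSON.parse(data)); });
  });
  req.on('error', callback);
  req.write(body);
  req.end();
}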

    Token Validation

The token validation in the examples provided in this post relies on an additional call to the OAuth server.

This can degrade OAuth server and Resource Service performance, as these calls add extra time and processing to the system under load.

    A good approach is to validate the token locally, using the OAuth server signing key to validate the signature and the Access Token payload.

    By default, OAM uses the public certificate under the alias “oracert” to sign the OAuth tokens.

    This certificate can be found in <DOMAIN_HOME>/config/fmwconfig/default-keystore.jks keystore.

To export the certificate, run the following keytool command:

keytool -exportcert -alias "oracert" -keystore /u02/oracle/domains/OAMDomain/config/fmwconfig/default-keystore.jks -file oracert.der -storepass <STORE_PASS>

In a vanilla installation, the default-keystore password should be the same as the OAM keystore password, which can be found using the following WLST command:

listCred(map="OAM_STORE", key="jks")

There are many OAuth toolkits available that can be used to validate tokens locally.

    A simple code implementation using one of the available toolkits would look like this:

// The JWS classes below are assumed to come from a JOSE toolkit such as
// Nimbus JOSE+JWT (com.nimbusds.jose.JWSObject, com.nimbusds.jose.JWSVerifier,
// com.nimbusds.jose.crypto.RSASSAVerifier), with org.json.JSONObject for the payload.
public boolean validateJWT(String token, String scope) {
    boolean isValid = false;
    try {
        JWSObject jwsObject = JWSObject.parse(token);
        JWSVerifier verifier = new RSASSAVerifier(getPublicKey("oracert.der"));
        // verify the token signature against the exported OAM certificate
        isValid = jwsObject.verify(verifier);

        if (isValid) {
            String tk_payload = jwsObject.getPayload().toString();
            JSONObject obj = new JSONObject(tk_payload);

            // "prn" carries the principal (user) the token was issued for
            String userID = obj.getString("prn");

            String tokenScope = obj.getString("oracle.oauth.scope");

            // reject tokens that do not carry the required scope
            if (tokenScope != null && !tokenScope.contains(scope)) {
                isValid = false;
            }
        }
    } catch (Exception e) {
        isValid = false;
        e.printStackTrace();
    }
    return isValid;
}
    private RSAPublicKey getPublicKey(String filename) throws Exception {
        InputStream inStream = null;
        try {
            inStream = getClass().getClassLoader().getResourceAsStream("certs/"+filename);
            CertificateFactory cf = CertificateFactory.getInstance("X.509");
            X509Certificate cert = (X509Certificate)cf.generateCertificate(inStream);
            return (RSAPublicKey)cert.getPublicKey();
        } finally {
            if (inStream != null) {
                inStream.close();
            }
        }
    }

    Source Code

    The source code is provided “AS IS” with no express or implied warranty for accuracy or accessibility.

The code is intended to demonstrate the basic OAuth/OAM features and does not represent, by any means, the recommended approach, nor is it intended to be used in development or production environments.

    That being said, download the source_code.

    The examples are divided into two applications “ClientWebApp” and “ResourceService”.

ClientWebApp contains the code for the client part, which obtains an OAuth Token from OAM and makes calls to the REST endpoints in the ResourceService application.

    ResourceService will expose REST endpoints, which will receive calls from the ClientWebApp.

The ResourceService application validates the OAuth Token and decides whether to reply with the requested data.

The source code is self-explanatory and commented, so one can follow along with the implementation.

    Once the OAM configuration for the OAuth artifacts is done, as explained in Part I of this post series, one can adjust the URLs in the source code, compile, and deploy both applications to any servlet container, and test the complete scenario.

I hope this post series helps you understand the basics of OAM’s OAuth implementation – enjoy!


    HCM Atom Feed Subscriber using Node.js


    Introduction

    HCM Atom feeds provide notifications of Oracle Fusion Human Capital Management (HCM) events and are tightly integrated with REST services. When an event occurs in Oracle Fusion HCM, the corresponding Atom feed is delivered automatically to the Atom server. The feed contains details of the REST resource on which the event occurred. Subscribers who consume these Atom feeds use the REST resources to retrieve additional information about the resource.

    For more information on Atom, please refer to this.

This post focuses on consuming and processing HCM Atom feeds using Node.js. The assumption is that the reader has some basic knowledge of Node.js. Please refer to this link to download and install Node.js in your environment.

Node.js is a platform for executing server-side JavaScript code. It enables real-time, two-way connections in web applications with push capability, using a non-blocking, event-driven I/O paradigm. It runs on a single-threaded event loop and leverages asynchronous calls for operations such as I/O. This is an evolution from the traditional stateless request-response paradigm of the web. For example, when a request is sent to invoke a service such as REST or a database query, Node.js continues serving new requests; when the response comes back, it jumps back to the respective requestor. Node.js is lightweight and provides a high level of concurrency. However, it is not suitable for CPU-intensive operations, as it is single-threaded.

Node.js is built on an event-driven, asynchronous model. Incoming requests are non-blocking; each request is passed off to an asynchronous callback handler, which frees up the main thread to respond to more requests.

For more information on Node.js, please refer to this.

     

    Main Article

    Atom feeds enable you to keep track of any changes made to feed-enabled resources in Oracle HCM Cloud. For any updates that may be of interest for downstream applications, such as new hire, terminations, employee transfers and promotions, Oracle HCM Cloud publishes Atom feeds. Your application will be able to read these feeds and take appropriate action.

Atom Publishing Protocol (AtomPub) allows software applications to subscribe to changes that occur on REST resources through published feeds. Updates are published when changes occur to feed-enabled resources in Oracle HCM Cloud. The following are the primary Atom feeds:

    Employee Feeds

    New hire
    Termination
    Employee update

    Assignment creation, update, and end date

    Work Structures Feeds (Creation, update, and end date)

    Organizations
    Jobs
    Positions
    Grades
    Locations

The above feeds can be consumed programmatically. In this post, Node.js is implemented as one of the solutions for consuming “Employee New Hire” feeds, but the design and development is similar for all the supported objects in HCM.

     

Refer to my blog on how to invoke secured REST services using Node.js.

    Security

The RESTful services in Oracle HCM Cloud are protected with Oracle Web Service Manager (OWSM). The server policy allows the following client authentication types:

    • HTTP Basic Authentication over Secure Socket Layer (SSL)
    • Oracle Access Manager(OAM) Token-service
    • Simple and Protected GSS-API Negotiate Mechanism (SPNEGO)
    • SAML token

The client must provide one of the above authentication types in the security headers of the invocation call. The sample in this post uses the HTTP Basic Authentication over SSL policy.

     

    Fusion Security Roles

    REST and Atom Feed Roles

To use Atom feeds, a user must have any HCM Cloud role that inherits the following roles:

    • “HCM REST Services and Atom Feeds Duty” – for example, Human Capital Management Integration Specialist
    • “Person Management Duty” – for example, Human Resource Specialist

    REST/Atom Privileges

     

Privilege Name | Resource and Method

PER_REST_SERVICE_ACCESS_EMPLOYEES_PRIV | emps (GET, POST, PATCH)

PER_REST_SERVICE_ACCESS_WORKSTRUCTURES_PRIV | grades (GET), jobs (GET), jobFamilies (GET), positions (GET), locations (GET), organizations (GET)

PER_ATOM_WORKSPACE_ACCESS_EMPLOYEES_PRIV | employee/newhire (GET), employee/termination (GET), employee/empupdate (GET), employee/empassignment (GET)

PER_ATOM_WORKSPACE_ACCESS_WORKSTRUCTURES_PRIV | workstructures/grades (GET), workstructures/jobs (GET), workstructures/jobFamilies (GET), workstructures/positions (GET), workstructures/locations (GET), workstructures/organizations (GET)

     

     

    Atom Payload Response Structure

    The Atom feed response is in XML format. Please see the following diagram to understand the feed structure:

     

    AtomFeedSample_1

     

A feed can have multiple entries. The entries are ordered by the “updated” timestamp of the <entry>, with the most recent first. Two critical elements provide information on how to process these entries downstream.

    Content

The <content> element contains critical attributes such as Employee Number, Phone, Suffix, CitizenshipLegislation, EffectiveStartDate, Religion, PassportNumber, NationalIdentifierType, EventDescription, LicenseNumber, EmployeeName, WorkEmail, and NationalIdentifierNumber. It is in JSON format, as you can see from the above diagram.

    Resource Link

If the data provided in <content> is not sufficient, the RESTful service resource link is provided to get more details. Please refer to the employee resource link for each entry in the above diagram. Node.js can invoke this RESTful resource link.

     

    Avoid Duplicate Atom Feed Entries

To avoid consuming feeds with duplicate entries, one of the following parameters must be provided so that only feeds published since the last poll are consumed:

1. updated-min: Returns entries within the collection where Atom:updated > updated-min.

Example: https://hclg-test.hcm.us2.oraclecloud.com/hcmCoreApi/Atomservlet/employee/newhire?updated-min=2015-09-16T09:16:00.000Z – returns entries published after “2015-09-16T09:16:00.000Z”.

2. updated-max: Returns entries within the collection where Atom:updated <= updated-max.

Example: https://hclg-test.hcm.us2.oraclecloud.com/hcmCoreApi/Atomservlet/employee/newhire?updated-max=2015-09-16T09:16:00.000Z – returns entries published at or before “2015-09-16T09:16:00.000Z”.

3. updated-min & updated-max: Returns entries within the collection where updated-min < Atom:updated <= updated-max.

Example: https://hclg-test.hcm.us2.oraclecloud.com/hcmCoreApi/Atomservlet/employee/newhire?updated-min=2015-09-11T10:03:35.000Z&updated-max=2015-09-16T09:16:00.000Z – returns entries published between “2015-09-11T10:03:35.000Z” and “2015-09-16T09:16:00.000Z”.

    Node.js Implementation

Refer to my blog on how to invoke secured REST services using Node.js. The following are things to consider when consuming feeds:

    Initial Consumption

When you subscribe for the first time, you can invoke the resource without query parameters to get all the published feeds, or use the updated-min or updated-max arguments to filter entries in a feed to begin with.

For example, the invocation path could be /hcmCoreApi/Atomservlet/employee/newhire or /hcmCoreApi/Atomservlet/employee/newhire?updated-min=<some-timestamp>

After the first consumption, the “updated” element of the first entry must be persisted and used in the next call to avoid duplication. In this prototype, the “/entry/updated” timestamp value is persisted in a file.

    For example:

//persist timestamp for the next call
if (i == 0) {
  fs.writeFile('updateDate', updateDate[0].text, function(fserr) {
    if (fserr) throw fserr;
  });
}
    

     

    Next Call

In the next call, read the updated timestamp value from the persisted file above to generate the path as follows:

//Check if updateDate file exists and is not empty
try {
  var lastFeedUpdateDate = fs.readFileSync('updateDate');
  console.log('Last Updated Date is: ' + lastFeedUpdateDate);
} catch (e) {
  // handle error
}

if (lastFeedUpdateDate.length > 0) {
  pathUri = '/hcmCoreApi/Atomservlet/employee/newhire?updated-min=' + lastFeedUpdateDate;
} else {
  pathUri = '/hcmCoreApi/Atomservlet/employee/newhire';
}
    

     

    Parsing Atom Feed Response

The Atom feed response is in XML format, as shown previously in the diagram. In this prototype, the “node-elementtree” package is used to parse the XML. You can use any library, as long as the following data is extracted for each entry in the feed for downstream processing.

var et = require('elementtree');

//Request call
var request = http.get(options, function(res) {
  var body = "";
  res.on('data', function(data) {
    body += data;
  });
  res.on('end', function() {

    //Parse Feed Response - the structure is defined in section: Atom Payload Response Structure
    feed = et.parse(body);

    //Identify if feed has any entries
    var numberOfEntries = feed.findall('./entry/').length;

    //if there are entries, extract data for downstream processing
    if (numberOfEntries > 0) {
      console.log('Get Content for each Entry');

      //Get Data based on XPath Expression
      var content = feed.findall('./entry/content/');
      var entryId = feed.findall('./entry/id');
      var updateDate = feed.findall('./entry/updated');

      for (var i = 0; i < content.length; i++) {

        //get Resource link for the respective entry
        console.log(feed.findall('./entry/link/[@rel="related"]')[i].get('href'));

        //get Content data of the respective entry, which is in JSON format
        console.log(feed.findall('content.text'));

        //persist timestamp for the next call
        if (i == 0) {
          fs.writeFile('updateDate', updateDate[0].text, function(fserr) {
            if (fserr) throw fserr;
          });
        }
      }
    }
  });
});
    
    

    One and Only One Entry

    Each entry in an Atom feed has a unique ID. For example: <id>Atomservlet:newhire:EMP300000005960615</id>

    In target applications, this ID can be used as one of the keys or lookups to prevent reprocessing. The logic can be implemented in your downstream applications or in the integration space to avoid duplication.
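Since the prototype in this post already persists each entry's content to a file named after its entry ID, a simple local duplicate check can be sketched as follows (illustrative only):

// Sketch: skip entries already processed, using the per-entry file the
// prototype writes as a simple local ledger.
var fs = require('fs');

function isNewEntry(entryId) {
  try {
    fs.statSync(entryId); // file exists => entry already processed
    return false;
  } catch (e) {
    return true;          // no file yet => first time this ID is seen
  }
}

if (isNewEntry('Atomservlet:newhire:EMP300000005960615')) {
  // ...process the entry, then persist it as the prototype does...
}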

     

    Downstream Processing Pattern

The Node.js scheduler can be implemented to consume feeds periodically. Once the message is parsed, there are several patterns to support various use cases. In addition, you could have multiple subscribers, such as Employee New Hire, Employee Termination, Locations, Jobs, and Positions. For guaranteed transactions, each feed entry can be published to Messaging Cloud or Oracle Database to stage all the feeds. This pattern provides global transactions and recovery when downstream applications are unavailable or throw errors. The following diagram shows the high-level architecture, followed by a minimal polling sketch:

    nodejs_soa_atom_pattern
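A minimal polling scheduler can be as simple as wrapping the feed consumption in a timer. The sketch below assumes the prototype's consumption logic has been wrapped in a consumeFeed() function (a hypothetical name); errors should be handled inside that function, since the HTTP calls are asynchronous.

// Sketch: poll the Atom feed periodically. consumeFeed() is assumed to
// wrap the prototype's consumption logic and handle its own errors.
var POLL_INTERVAL_MS = 5 * 60 * 1000; // e.g. every five minutes

setInterval(consumeFeed, POLL_INTERVAL_MS);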

     

    Conclusion

This post demonstrates how to consume HCM Atom feeds and process them for downstream applications. It provides details on how to consume new feeds (avoiding duplication) since the last poll. Finally, it presents an enterprise integration pattern from feed consumption to downstream application processing.

     

    Sample Prototype Code

var et = require('elementtree');

var uname = 'username';
var pword = 'password';
var http = require('https'),
    fs = require('fs');

var XML = et.XML;
var ElementTree = et.ElementTree;
var element = et.Element;
var subElement = et.SubElement;

var lastFeedUpdateDate = '';
var pathUri = '';

//Check if updateDate file exists and is not empty
try {
  lastFeedUpdateDate = fs.readFileSync('updateDate');
  console.log('Last Updated Date is: ' + lastFeedUpdateDate);
} catch (e) {
  // add error logic
}

//get last feed updated date to get entries since that date
if (lastFeedUpdateDate.length > 0) {
  pathUri = '/hcmCoreApi/atomservlet/employee/newhire?updated-min=' + lastFeedUpdateDate;
} else {
  pathUri = '/hcmCoreApi/atomservlet/employee/newhire';
}

// Generate Request Options
var options = {
  ca: fs.readFileSync('HCM Cert'), //get HCM Cloud certificate - either through openssl or export from web browser
  host: 'HCMHostname',
  port: 443,
  path: pathUri,
  "rejectUnauthorized": false,
  headers: {
    'Authorization': 'Basic ' + new Buffer(uname + ':' + pword).toString('base64')
  }
};

//Invoke REST resource for Employee New Hires
var request = http.get(options, function(res) {
  var body = "";
  res.on('data', function(data) {
    body += data;
  });
  res.on('end', function() {

    //Parse Atom Payload response
    feed = et.parse(body);

    //Get Entries count
    var numberOfEntries = feed.findall('./entry/').length;

    console.log('...................Feed Extracted.....................');
    console.log('Number of Entries: ' + numberOfEntries);

    //Process each entry
    if (numberOfEntries > 0) {
      console.log('Get Content for each Entry');

      var content = feed.findall('./entry/content/');
      var entryId = feed.findall('./entry/id');
      var updateDate = feed.findall('./entry/updated');

      for (var i = 0; i < content.length; i++) {
        console.log(feed.findall('./entry/link/[@rel="related"]')[i].get('href'));
        console.log(feed.findall('content.text'));

        //persist timestamp for the next call
        if (i == 0) {
          fs.writeFile('updateDate', updateDate[0].text, function(fserr) {
            if (fserr) throw fserr;
          });
        }

        //persist each entry content keyed by its unique entry id
        fs.writeFile(entryId[i].text, content[i].text, function(fserr) {
          if (fserr) throw fserr;
        });
      }
    }
  });
});

//handle request-level errors (e.g. connection failures)
request.on('error', function(e) {
  console.log("Got error: " + e.message);
});
    
     
    

     

    Need to Integrate your Cloud Applications with On-Premise Systems… What about ODI?


    As described in the A-Team article A Universal Cloud Applications Adapter for ODI, it is technically possible to integrate Cloud Applications with ODI.

    Cloud Applications Drivers and Adapters

    In order to guide you on how ODI can help you to integrate your Cloud Applications, we have compiled the list of SaaS applications and PaaS services that Oracle Data Integrator can connect to as of today.

ODI uses the following specific drivers or adapters to connect to SaaS applications or PaaS services, using them as sources or targets to extract or load data. Once connected, those cloud applications can be used like any on-premise application from an ODI point of view.

     

Type | Source/Target | Connectivity Option

SaaS | Oracle Service Cloud (RightNow) | JDBC driver from DataDirect
SaaS | Oracle Marketing Cloud (Eloqua) | JDBC driver from DataDirect
SaaS | Salesforce.com | Adapter from Bristlecone or JDBC driver from DataDirect
SaaS | Workday | JDBC driver from DataDirect
SaaS | ServiceNow | Adapter from Bristlecone or JDBC driver from DataDirect
SaaS | FinancialForce.com | JDBC driver from DataDirect
SaaS | SuccessFactors | Adapter from Bristlecone or JDBC driver from DataDirect
SaaS | VeevaCRM | JDBC driver from DataDirect
SaaS | SugarCRM | JDBC driver from DataDirect
SaaS | ServiceMax | JDBC driver from DataDirect
SaaS | Google Analytics | JDBC driver from DataDirect
SaaS | Microsoft Dynamics CRM | JDBC driver from DataDirect
SaaS | Concur | Adapter from Bristlecone
PaaS | Oracle Database Cloud Service | ODI: Regular JDBC or JDBC over SSH
PaaS | Oracle Business Intelligence Cloud Service | Knowledge Modules for BICS
PaaS | Oracle Storage Cloud Service | Open Tools for Storage Cloud Service
PaaS | Amazon Redshift | Adapter from Bristlecone or JDBC driver from DataDirect
PaaS | Amazon RDS | Native JDBC connectivity to hosted database
PaaS | Azure SQL | Native JDBC connectivity to hosted database
PaaS | Oracle DB on Azure | Native JDBC connectivity to hosted database

     

    Conclusion

Using these cloud JDBC drivers and adapters allows you to expand your data integration initiatives to include PaaS and SaaS connectivity. We will update this list as more become available.

    For more ODI best practices, tips, tricks, and guidance that the A-Team members gain from real-world experiences working with customers and partners, visit Oracle A-Team Chronicles for ODI.

    Acknowledgements

    Special thanks to Julien Testut, Oracle Product Manager for his help and support in putting this list together.

    Displaying Oracle Documents Cloud Services File Picker and Link Picker from other Domains


    Introduction

The Oracle Documents File Picker and Link Picker allow web applications to select files and folders that reside in the Oracle Documents Cloud Service. These Pickers can start from a specific folder and can further be restricted to a folder structure for a specific user and role when combined with the Oracle Documents App Link. Some of these integration calls are made from external domains, so this embedded content transfer needs to be explicitly allowed and configured.

    Main Article

The File Picker tutorial allows the user to choose a List or a Grid layout and the sort order. Options include the ability to select a single item, select folders only, select files only, and allow the upload of new files, as shown below:

    File Picker Tutorial

     

Most applications run on a different domain from the one where the Oracle Documents Cloud Service (DOCS) instance is running. Therefore, an additional configuration step is required to embed the DOCS web user interface in an inline frame:

1) Go to the Administration page on the source DOCS instance (http://hostname:port/documents/admin)
2) Go to the System-wide Settings tab
3) Go to the Embedded Content section and select YES
4) Add the target domain and the CAPTCHA/Safe Mode options

    Further information is available in the document “Administering Oracle Documents Cloud Service” – section: “Displaying Content from Other Domains” in the link below:
    http://docs.oracle.com/cloud/latest/documentcs_welcome/WCCCA/GUID-6511347B-87ED-43D9-A183-BBD91E9E17C8.htm#WCCCA-GUID-6511347B-87ED-43D9-A183-BBD91E9E17C8
    The example below shows a Java Cloud Service (JCS) application invoking the DOCS File Picker from a different domain.

    First of all, the domain where the app is going to be invoking the File Picker needs to be added to the list of allowed domains:

    File Picker Embedded Content YES

Note that CAPTCHA was enabled for the invoking domain. This will challenge users with a simple visual test, preventing automated scripts from reaching the DOCS instance.

A simple application was created to call the DOCS File Picker from an HTML page. This application was built in JDeveloper so that the EAR file could be deployed to the Java Cloud Service.

    File Picker App in JDeveloper

    The application was deployed successfully in JCS:

    File Picker App deployed in JCS

When running the app, the simple visual challenge below is issued, since CAPTCHA was enabled for this target domain:

    File Picker CAPTCHA

    Note the two different domains from the application running on JCS and from the DOCS instance:

    File Picker App running on JCS

After selecting the item to be picked, DOCS returns JSON data corresponding to the selection made:

    File Picker returns data

The preClick mechanism allows options to be passed to createFilePickerButton – for example, when the ID of the initial folder is not available until page load. The File or Link Picker can create the button, or a custom one can be used. Using the pre-configured button from DOCS is simpler, but it may not match UI requirements, in which case a custom button would be desired.

function onPreClick(options) {
	//Application logic to obtain the folder id at page load time
	options.id = getFolderId();
};
    
    function onOk(selection) {
    	//Do something with the selection 	
    };
    	
    window.onload=function() {	
    	options = {
    	   preClick : onPreClick,
    	   ok : onOk,
    	   id: ""  // Id not yet known  
    	};
    
    	var button = OracleDCS.createFilePickerButton(options);
    	document.getElementById("button-container").appendChild(button);		
    };

Both the File and Link Picker can be used in combination with an AppLink to restrict access to a folder structure for a specified user and role. The Picker needs an appLinkID, an accessToken, and a refreshToken; these are returned by the createAppLink service.

    function onOk(selection) {
    	//Do something with the selection 	
    };
    	
    window.onload=function() {	
    
    	//Call application logic to get AppLink target folder id
    	var appLinkFolderId = getFolderId(); 
    
    	//Application logic to invoke create folder AppLink service with folder id
    	var createApplink = createAppLink(appLinkFolderId);
    
    	options = {
    		ok: onOk,
		id: appLinkFolderId,
    		appLinkId: createApplink.appLinkID,
    		appLinkAccessToken: createApplink.accessToken,
    		appLinkRefreshToken: createApplink.refreshToken
    	};
    
    	var button = OracleDCS.createFilePickerButton(options);
    	document.getElementById("button-container").appendChild(button);		
    };

    The source for the above code is available at the Picker Tutorials:

    http://hostname:port/documents/static/api/FilePickerTutorial.html

    http://hostname:port/documents/static/api/LinkPickerTutorial.html

    The hostname and port variables identify the Oracle Documents Cloud Service instance.

In summary, invoking domains need to be added to the list of allowed domains in the Embedded Content section of DOCS Administration. If a domain is not present, the application will remain at the File Picker screen and will not proceed with any action when OK is clicked, until the File/Link Picker dialog is cancelled and closed. The simple JDeveloper application here called the DOCS File Picker; however, the same principles apply to the Link Picker.

     

    HCM Atom Feed Subscriber using SOA Cloud Service


    Introduction

    HCM Atom feeds provide notifications of Oracle Fusion Human Capital Management (HCM) events and are tightly integrated with REST services. When an event occurs in Oracle Fusion HCM, the corresponding Atom feed is delivered automatically to the Atom server. The feed contains details of the REST resource on which the event occurred. Subscribers who consume these Atom feeds use the REST resources to retrieve additional information about the resource.

    For more information on Atom, please refer to this.

This post focuses on consuming and processing HCM Atom feeds using Oracle Service Oriented Architecture (SOA) Cloud Service. Oracle SOA Cloud Service provides a PaaS computing platform for running Oracle SOA Suite, Oracle Service Bus, and Oracle API Manager in the cloud. For more information on SOA Cloud Service, please refer to this.

Oracle SOA is the industry’s most complete and unified application integration and SOA solution. It transforms complex application integration into agile and reusable service-based connectivity to speed time to market, respond faster to business requirements, and lower costs. SOA facilitates the development of enterprise applications as modular business web services that can be easily integrated and reused, creating a truly flexible, adaptable IT infrastructure.

For more information on getting started with Oracle SOA, please refer to this. For developing SOA applications using SOA Suite, please refer to this.

     

    Main Article

    Atom feeds enable you to keep track of any changes made to feed-enabled resources in Oracle HCM Cloud. For any updates that may be of interest for downstream applications, such as new hire, terminations, employee transfers and promotions, Oracle HCM Cloud publishes Atom feeds. Your application will be able to read these feeds and take appropriate action.

    Atom Publishing Protocol (AtomPub) allows software applications to subscribe to changes that occur on REST resources through published feeds. Updates are published when changes occur to feed-enabled resources in Oracle HCM Cloud. These are the following primary Atom feeds:

    Employee Feeds

    New hire
    Termination
    Employee update

    Assignment creation, update, and end date

    Work Structures Feeds (Creation, update, and end date)

    Organizations
    Jobs
    Positions
    Grades
    Locations

The above feeds can be consumed programmatically. In this post, SOA Cloud Service is implemented as one of the solutions for consuming “Employee New Hire” feeds, but the design and development is similar for all the supported objects in HCM.

     

    HCM Atom Introduction

For Atom “security, roles and privileges”, please refer to my blog HCM Atom Feed Subscriber using Node.js.

     

    Atom Feed Response Template

     

    AtomFeedSample_1

    SOA Cloud Service Implementation

Refer to my blog on how to invoke secured REST services using SOA. The following diagram shows the pattern used to subscribe to HCM Atom feeds and process them for downstream applications that may have either web service or file-based interfaces. Optionally, all entries from the feeds can be staged in a database or messaging cloud before processing, for situations where a downstream application is unavailable or throwing system errors. This provides the ability to consume the feeds but hold the processing until downstream applications are available. Enterprise Scheduler Service (ESS), a component of SOA Suite, is leveraged to invoke the subscriber composite periodically.

     

    soacs_atom_pattern

    The following diagram shows the implementation of the above pattern for Employee New Hire:

    soacs_atom_composite

     

    Feed Invocation from SOA

Though the HCM Cloud feed is an XML representation, the media type of the payload response is “application/atom+xml”. This media type is not supported by the built-in REST Adapter at this time, so use the following Java embedded activity in your BPEL component. Once the built-in REST Adapter supports the Atom media type, the Java embedded activity can be replaced, further simplifying the solution:

try {
    String url = "https://mycompany.oraclecloud.com";
    String lastEntryTS = (String)getVariableData("LastEntryTS");
    String uri = "/hcmCoreApi/atomservlet/employee/newhire";

    //Generate URI based on last entry timestamp from previous invocation
    if (!(lastEntryTS.isEmpty())) {
        uri = uri + "?updated-min=" + lastEntryTS;
    }

    java.net.URL obj = new URL(null, url + uri, new sun.net.www.protocol.https.Handler());

    javax.net.ssl.HttpsURLConnection conn = (HttpsURLConnection) obj.openConnection();
    conn.setRequestProperty("Content-Type", "application/vnd.oracle.adf.resource+json");
    conn.setDoOutput(true);
    conn.setRequestMethod("GET");

    String userpass = "username" + ":" + "password";
    String basicAuth = "Basic " + javax.xml.bind.DatatypeConverter.printBase64Binary(userpass.getBytes("UTF-8"));
    conn.setRequestProperty("Authorization", basicAuth);

    String response = "";
    int responseCode = conn.getResponseCode();
    System.out.println("Response Code is: " + responseCode);

    if (responseCode == HttpsURLConnection.HTTP_OK) {
        BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream()));

        String line;
        String contents = "";

        while ((line = reader.readLine()) != null) {
            contents += line;
        }

        setVariableData("outputVariable", "payload", "/client:processResponse/client:result", contents);

        reader.close();
    }
} catch (Exception e) {
    e.printStackTrace();
}
    
    

     

The following are things to consider when consuming feeds:

    Initial Consumption

When you subscribe for the first time, you can invoke the resource without query parameters to get all the published feeds, or use the updated-min or updated-max arguments to filter entries in a feed to begin with.

For example, the invocation path could be /hcmCoreApi/Atomservlet/employee/newhire or /hcmCoreApi/Atomservlet/employee/newhire?updated-min=<some-timestamp>

After the first consumption, the “updated” element of the first entry must be persisted and used in the next call to avoid duplication. In this prototype, the “/entry/updated” timestamp value is persisted in Database Cloud Service (DBaaS).

This is the sample database table:

    create table atomsub (
    id number,
    feed_ts varchar2(100) );
    

For the initial consumption, keep the table empty, or add a row with a feed_ts value to consume feeds from that point. For example, the feed_ts value could be “2015-09-16T09:16:00.000Z” to get all the feeds after this timestamp.

In the SOA composite, you update the above table to persist the “/entry/updated” timestamp in the feed_ts column of the “atomsub” table.

     

    Next Call

In the next call, read the updated timestamp value from the database and generate the URI path as follows:

String uri = "/hcmCoreApi/atomservlet/employee/newhire";
String lastEntryTS = (String)getVariableData("LastEntryTS");
if (!(lastEntryTS.isEmpty())) {
    uri = uri + "?updated-min=" + lastEntryTS;
}

The above step is done in the Java embedded activity, but it could also be done in SOA using <assign> expressions.

    Parsing Atom Feed Response

The Atom feed response is in XML format, as shown previously in the diagram. In this prototype, the feed response is stored in the output variable as a string. The following expression in an <assign> activity converts it to XML:

    oraext:parseXML($outputVariable.payload/client:result)
    
    
    

    Parsing Each Atom Entry for Downstream Processing

Each entry has two major elements, as mentioned in the Atom response payload structure.

    Resource Link

This contains the REST employee resource link used to get the Employee object. This is a typical REST invocation from SOA using the REST Adapter. For more information on invoking REST services from SOA, please refer to my blog.

     

    Content Type

This contains selected resource data in JSON format. For example: {"Context": [{"EmployeeNumber": "212", "PersonId": "300000006013981", "EffectiveStartDate": "2015-10-08", "EffectiveDate": "2015-10-08", "WorkEmail": "phil.davey@mycompany.com", "EmployeeName": "Davey, Phillip"}]}

In order to use the above data, it must be converted to XML. The BPEL component provides a Translate activity to transform JSON to XML. Please refer to the SOA Development document, section B1.8 – doTranslateFromNative.

     

The <Translate> activity syntax to convert the above JSON string from <content> is as follows:

    <assign name="TranslateJSON">
    <bpelx:annotation>
    <bpelx:pattern>translate</bpelx:pattern>
    </bpelx:annotation>
    <copy>
     <from>ora:doTranslateFromNative(string($FeedVariable.payload/ns1:entry/ns1:content), 'Schemas/JsonToXml.xsd', 'Root-Element', 'DOM')</from>
     <to>$JsonToXml_OutputVar_1</to>
     </copy>
    </assign>
    

    This is the output:

    jsonToXmlOutput

The following provides detailed steps on how to use the Native Format Builder in JDeveloper:

In the Native Format Builder, select the JSON format and use the above <content> as a sample to generate a schema. Please see the following diagrams:

    JSON_nxsd_1JSON_nxsd_2JSON_nxsd_3

    JSON_nxsd_5

     

    One and Only One Entry

    Each entry in an Atom feed has a unique ID. For example: <id>Atomservlet:newhire:EMP300000005960615</id>

    In target applications, this ID can be used as one of the keys or lookups to prevent reprocessing. The logic can be implemented in your downstream applications or in the integration space to avoid duplication.

     

    Scheduler and Downstream Processing

Oracle Enterprise Scheduler Service (ESS) is configured to invoke the above composite periodically. At present, SOA Cloud Service is not provisioned with ESS, but refer to this to extend your domain. Once the feed response message is parsed, you can process it for downstream applications based on your requirements or use cases. For guaranteed transactions, each feed entry can be published to Messaging Cloud or Oracle Database to stage all the feeds. This provides global transactions and recovery when downstream applications are unavailable or throw errors.

The following diagram shows how to create a job definition for a SOA composite. For more information on ESS, please refer to this.

    ess_3

    SOA Cloud Service Instance Flows

    First invocation without updated-min argument to get all the feeds

     

    soacs_atom_instance_json

    Atom Feed Response from above instance

    AtomFeedResponse_1

     

    Next invocation with updated-min argument based on last entry timestamp

    soacs_atom_instance_noentries

     

    Conclusion

This post demonstrates how to consume HCM Atom feeds and process them for downstream applications. It provides details on how to consume new feeds (avoiding duplication) since the last poll. Finally, it presents an enterprise integration pattern from feed consumption to downstream application processing.

     

    Sample Prototype Code

    The sample prototype code is available here.

     

    soacs_atom_composite_1

     

     
    

    Retrieving the OAM SessionID for Fun and Profit!


    Introduction

    I recently worked with a customer who needed to do some OAM session manipulation via custom code in order to implement a complex use case. While the focus of this post is not to go into details about a specific implementation, I did want to share some advice on a very necessary building block needed to do “out of band” session manipulation: retrieving the OAM Session ID.

    What is the Session ID (used for)?

OAM 11g supports the concept of a server-side session (unlike previous versions, where the only session state was represented by a browser cookie), and this architecture allows for a far richer set of functionality, including the ability to manipulate the server-side session through the addition of attribute values that can be considered during the evaluation of an OAM policy. Each session stored in the server session store (shared across the cluster using Coherence) is identified by a unique GUID known as the Session ID – that’s the long number you see in the following screenshot, taken from the OAM Admin Console:

    Session-ID

The reason this identifier is useful becomes clear when you start to look at the API docs for the OAM Access SDK, which is the component you’ll need to use in order to do things like session manipulation from custom code. Looking specifically at the UserSession class, you’ll note that several of the utility methods require that you pass the SessionID as an argument; this is a mandatory step to obtain a reference to an existing session in order to manipulate it.

Just a clarification at this point: please do not interpret this post as a blanket endorsement of writing a custom Access Client as the solution to any and all problems. As always, work with your chosen OAM architect to carefully weigh the pros and cons of the various options available, with a strong preference for using out-of-the-box functionality, before concluding that custom code is the best way to solve your particular problem.

Assuming we’ve complied with the above caveat, done the necessary homework, and concluded that a custom Access Client is the way to go, what we then need is a way to obtain the Session ID from an existing authenticated user session in order to pass it to our custom code.

    Using Identity Assertion to obtain the Session ID

Step 1 here is (perhaps obviously) to place a WebGate in front of our custom code, to ensure that there is actually a session in place and that we can use an authorization policy response to transfer information to that code via headers (the usual OAM approach). Now, for a number of very good reasons, the Session ID (a sensitive piece of information that needs to be protected) is not available as a direct policy response in the same way that the user ID, profile attributes, or session attributes are. As is clear from the AccessSDK documentation, you can do a lot of harm with this SessionID in your hands; as such, it behoves you as an organization to take appropriate steps to protect this data within your code and over your network.

Treat the SessionID, in other words, just as you would a password: do not write it to log files, do not send it over network links in clear text, and take the necessary precautions to ensure that headers sent from your web tier to your app tier cannot be tampered with or spoofed. All the usual rules regarding safe and secure identity propagation apply.

With the caveats and good-practice advice out of the way, let’s talk about how to get this magical nugget of info into your custom code. The answer is to enable Identity Propagation for the authorization policy protecting the URL, as per the following screenshot.

    Assertion

    Once you do this, you will find that a SAML assertion is sent from the WebGate to your app in a header called “OAM_IDENTITY_ASSERTION”. There’s a lot of info inside this assertion, but you’re looking for the following snippet within the XML body.

<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xs="http://www.w3.org/2001/XMLSchema" Version="2.0" ID="6714fd68-596c-4cfa-af61-91b43a5ecd2a" IssueInstant="2015-10-23T09:34:23Z">
<saml:Issuer>OAM User Assertion Issuer</saml:Issuer>
.....
<saml:AttributeStatement>
<saml:Attribute NameFormat="urn:oasis:names:tc:SAML:2.0:attrname-format:uri" Name="urn:oasis:names:tc:SAML:2.0:profiles:session:sessionId">
<saml:AttributeValue xsi:type="xs:string">d8ccd738-522f-4700-a91c-fd630b70ff61|S+kAgs+tqO+Rblq6abFwllAo5J4=</saml:AttributeValue>
</saml:Attribute>
......
</saml:AttributeStatement>
</saml:Assertion>

    Right there, in the bolded text, is your Session ID. Use it wisely – keep it secret, keep it safe.
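To illustrate what consuming that header might look like, here is a minimal Node.js sketch for an app sitting behind the WebGate. The regular-expression extraction is an illustrative shortcut – real code should use a proper XML/SAML parser – and, per the advice above, the value must be treated like a password.

// Sketch: extract the OAM Session ID from the OAM_IDENTITY_ASSERTION
// header. Illustrative only - use a real SAML/XML parser in practice,
// and never log or transmit the Session ID in clear text.
var http = require('http');

function extractSessionId(assertionXml) {
  var match = /profiles:session:sessionId"[\s\S]*?<saml:AttributeValue[^>]*>([^<]+)<\/saml:AttributeValue>/
    .exec(assertionXml);
  return match ? match[1] : null;
}

http.createServer(function(req, res) {
  // Node.js lower-cases incoming header names
  var assertion = req.headers['oam_identity_assertion'];
  var sessionId = assertion ? extractSessionId(assertion) : null;
  // ...pass sessionId on to the custom Access Client code here...
  res.end(sessionId ? 'Session ID extracted' : 'No assertion present');
}).listen(8080);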
