Topics for 12hr investigation/blog welcome. Java/PLSQL/DBA

New content being created.

No longer at Oracle.

Suggested post: 

Minor database docker/free tier log.

For:

CV/portfolio/interview discussion.

Topics for 12hr investigation/blog welcome.

Available for immediate employment.

(1hr commute from Beaumont, Dublin 9, Ireland. (North Dublin, 30mins from city centre))

https://ie.linkedin.com/in/turlochotierney

Senior PLSQL/Java/Oracle Database Professional, formerly at Oracle Dublin. 30 years' experience. PLSQL | Java | Oracle Database | Linux | git | 11g DBA OCP | Gitlab | C | macOS


Comments

  1. Log: I set up Oracle 23ai Free (ARM) on podman, based on: https://ronekins.com/2025/06/13/run-oracle-database-23ai-free-on-arm-based-macbooks-with-podman/?amp=1
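
    For reference, a minimal sketch of the sort of podman commands involved (the image tag and password are assumptions; see the linked post for the exact steps):

    ```shell
    # Pull the 23ai Free image (tag is an assumption; check the registry)
    podman pull container-registry.oracle.com/database/free:latest

    # Run it, exposing the listener port and setting the initial password
    podman run -d --name oracle23ai \
      -p 1521:1521 \
      -e ORACLE_PWD=MySecret123 \
      container-registry.oracle.com/database/free:latest

    # Connect from inside the container once it is healthy
    podman exec -it oracle23ai sqlplus sys/MySecret123@FREEPDB1 as sysdba
    ```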

  2. Autonomous ATP created: no configuration choices of note.

    Working with wallet/autonomous:
    In podman:
    sqlplus (i.e. thick/OCI/'C')
    sqlcl (thick or thin)

    Native macOS:
    sqlplus

    On web:
    Database Actions worksheet.

    Not working:
    macOS:
    SQL Developer Java/VS Code/sqlcl:
    with thick (version mismatch C/JDBC) or thin wallet (likely crypto choice clash).
    (SQL workaround: exec into podman, i.e. Linux/a small script to automate.)
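
    For the combinations that did work, the connections were along these lines (wallet path and service name are placeholders):

    ```shell
    # sqlcl (thin): point at the downloaded wallet zip directly
    sql -cloudconfig /path/to/Wallet_mydb.zip admin@mydb_high

    # sqlplus (thick, inside podman): unzip the wallet,
    # point TNS_ADMIN at it, then connect as usual
    export TNS_ADMIN=/path/to/wallet
    sqlplus admin@mydb_high
    ```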

  3. Autonomous: schema export and import (could be scripted periodically - starts to cost at certain data amounts).
    Works with free tier - yes.
    Client-side software - sqlcl (commands: PLSQL datapump and oci).
    Will slow you down - finding the URL path for your file on block storage.
    --
    Nothing too research-like except: blogs, docs, Oracle Cloud, and sqlcl help.
    --
    Created OCI config file including private key.
    Created test schema.
    Created credential.
    (Bucket already created.)
    Exported to DATA_PUMP_DIR.
    Copied to bucket.
    Downloaded with the (podman) sqlcl oci command.
    Changed name.
    Uploaded to bucket.
    Transferred to DATA_PUMP_DIR.
    Dropped schema.
    Created schema login.
    Imported dump file.

    Of note on the client: only sqlcl and openssl required (used the ARM docker/podman ARM database release).
    (Information all available - a tiny bit of work figuring out 'your' URL for the bucket.)
    (Not done, but datapump can internally do the copy to cloud on Autonomous, if a credential is supplied.)
    TZ file must be equal or upwards only.
    Relies on Autonomous DBMS_CLOUD to move files between DATA_PUMP_DIR and the bucket.
    Tiny dump file: I did not encrypt or compress, just in case of licensing etc. (the datapump command or package has these options). There appears to be a problem where the dump file gets split up:
    there is a -fs setting to control the split dump file preference.
    Did not clean up/delete the dump file on block storage/DATA_PUMP_DIR yet.
    DBMS_CLOUD has a list-files-in-directory feature.
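
    As a sketch of the bucket transfer steps above (credential values, bucket URL, and file names are placeholders - the bucket URL is the part you have to figure out for your own tenancy):

    ```sql
    -- Create the credential once (values are placeholders)
    BEGIN
      DBMS_CLOUD.CREATE_CREDENTIAL(
        credential_name => 'MY_CRED',
        username        => 'oci_user',
        password        => 'auth_token_here');
    END;
    /

    -- Copy a dump file from DATA_PUMP_DIR to the bucket...
    BEGIN
      DBMS_CLOUD.PUT_OBJECT(
        credential_name => 'MY_CRED',
        object_uri      => 'https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket>/o/test.dmp',
        directory_name  => 'DATA_PUMP_DIR',
        file_name       => 'test.dmp');
    END;
    /

    -- ...and back again
    BEGIN
      DBMS_CLOUD.GET_OBJECT(
        credential_name => 'MY_CRED',
        object_uri      => 'https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket>/o/test.dmp',
        directory_name  => 'DATA_PUMP_DIR');
    END;
    /

    -- The list-files feature mentioned above
    SELECT object_name, bytes
      FROM DBMS_CLOUD.LIST_FILES('DATA_PUMP_DIR');
    ```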

  4. This comment has been removed by the author.

  5. /* Formatting lost - reformat in a PLSQL editor; the ampersand near THEHOST may confuse! */

    DEFINE THEHOST=xxx.com

    /**
    Tiny bit of fetch/MLE JavaScript in Oracle 23ai.
    */
    set serveroutput on

    CREATE TABLE mytableclob (
      id          NUMBER PRIMARY KEY,
      clob_column CLOB
    );


    DECLARE
      l_ctx       dbms_mle.context_handle_t;
      l_snippet   CLOB;
      l_result    CLOB;
      js_code     CLOB := q'[
        async function fetchData() {
          try {
            await import('mle-js-fetch');
            let response = await fetch('http://&THEHOST');
            if (!response.ok) {
              return `http error: ${response.status}`;
            }
            const data = await response.text();
            return data;
          } catch (err) {
            return `err error: ${err.message}`;
          }
        }
        fetchData();
      ]';
      result_clob CLOB;
    BEGIN
      DBMS_LOB.createtemporary(result_clob, TRUE);
      l_ctx := dbms_mle.create_context();
      DBMS_MLE.EVAL(language_id    => 'JAVASCRIPT',
                    source         => js_code,
                    context_handle => l_ctx,
                    result         => result_clob);
      l_result := result_clob;

      dbms_mle.drop_context(l_ctx);
      INSERT INTO mytableclob (id, clob_column)
      VALUES (1, l_result);
      COMMIT;
      DBMS_LOB.FREETEMPORARY(result_clob);
    EXCEPTION
      WHEN OTHERS THEN
        BEGIN
          IF l_ctx IS NOT NULL THEN
            DBMS_MLE.DROP_CONTEXT(l_ctx);
          END IF;
        EXCEPTION
          WHEN OTHERS THEN NULL;
        END;
        BEGIN
          --IF (DBMS_LOB.IS_TEMPORARY(result_clob) = 1) THEN
          DBMS_LOB.FREETEMPORARY(result_clob);
          --END IF;
        EXCEPTION
          WHEN OTHERS THEN NULL;
        END;
        RAISE;
    END;
    /
    SELECT DBMS_LOB.SUBSTR(clob_column, 1000, 1) AS preview
      FROM mytableclob
     WHERE id = 1;

    DELETE FROM mytableclob WHERE id = 1;
    COMMIT;
    --desc dbms_mle


    /**
    Package functions and types may vary between versions of Oracle 23ai - this is the macOS (not lite) docker image on podman!

    Banner: Oracle Database 23ai Free Release 23.0.0.0.0 - Develop, Learn, and Run for Free
    Version: 23.8.0.25.04


    Setup as SYS:

    grant dba to totierne identified by xxxx;
    grant create mle to totierne;
    GRANT EXECUTE DYNAMIC MLE TO totierne;
    grant EXECUTE ON JAVASCRIPT TO totierne;
    GRANT EXECUTE ON DBMS_LOB TO totierne;

    ACL:

    BEGIN
      DBMS_NETWORK_ACL_ADMIN.append_host_ace(
        host => '&THEHOST',
        ace  => xs$ace_type(
                  privilege_list => xs$name_list('connect'),
                  principal_name => 'totierne',
                  principal_type => xs_acl.ptype_db
                )
      );
    END;
    /


    With help from Oracle docs, blogs, and ChatGPT...

    Issues:
    - returning long text -> CLOB
    - dbms_mle.eval strangeness (different ChatGPT output / Oracle 23ai dot version?)
    - IS_TEMPORARY missing
    */

  6. Next Project: Call out to openAI and/or Local LLM (Ollama)using PLSQL or MLE Javascript.

    Replies
    1. Used a local LLM (Ollama/Llama3) using PLSQL. Slightly tricky and laborious - use a CLOB and keep to character boundaries when passing through Ollama/Llama3 in chunks (i.e. don't break multibyte (even the EUR symbol) non-8-bit data). The database side is byte semantics.

      Cheat: for the summary of summaries I kept within 4000 chars - rather than adding an extra summary stage.

      Academic: For Further Research:
      -Process on openAI/Claude...
      -Senior Developer suggested there is an adaptor to plug in a variety of AI back ends.

      Could have done better faster: change control/git.
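
      A rough sketch of the chunking idea (the source CLOB and the `send_to_ollama` wrapper are hypothetical; the point is that DBMS_LOB.SUBSTR on a CLOB counts characters, which is what keeps multibyte data intact):

      ```sql
      DECLARE
        l_doc    CLOB := my_big_clob;  -- hypothetical source of the text
        l_chunk  VARCHAR2(32767);
        l_amount PLS_INTEGER := 2000;  -- characters, not bytes
        l_offset PLS_INTEGER := 1;
      BEGIN
        WHILE l_offset <= DBMS_LOB.GETLENGTH(l_doc) LOOP
          -- DBMS_LOB.SUBSTR on a CLOB uses character semantics, so a
          -- multibyte character (even the EUR symbol) is never split.
          l_chunk := DBMS_LOB.SUBSTR(l_doc, l_amount, l_offset);
          -- send_to_ollama: hypothetical wrapper around UTL_HTTP posting
          -- JSON to http://localhost:11434/api/generate
          send_to_ollama(l_chunk);
          l_offset := l_offset + l_amount;
        END LOOP;
      END;
      /
      ```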

    2. Used the project command to dump my PLSQL to the Mac main disk outside the database.
      Next mini bloggable talk-at-interview project: PLSQL/Java/AWS/Oracle Database 26 beta program/Python... Learn/Study/Remind a lot vs. pick a new niche.

  7. Back March 2026. I do not want to over-spec in advance. In summary: prepare for interview and AI.

  8. I have done: SQL Server/Microsoft_Sybase to Oracle, including TSQL to PLSQL. Now specing out some aspect of Oracle to Postgres (or a hybrid, to avoid more costly Oracle options - while still having a C-suite-friendly Oracle source of truth/mirroring). Oracle was reluctant to get data out of Oracle Database? Maybe that is also a niche - Kafka seems to be the buzzword. Java message queues (RabbitMQ, in Erlang?) were the buzzword 15 years ago - but no one gets a bonus for using Java these days. To Postgres - or 'stream/queue etc. for other consumption'. (ChatGPT comment: GoldenGate (basically cool fast streaming) seems to require Oracle Database Enterprise Edition.)

    Replies
    1. Also ETL... getting data streams in/out of the database - 'even' sqlldr - seems to be a hot interview question.
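
      For the sqlldr case, a minimal sketch (file, table, and column names are assumptions):

      ```sql
      -- load.ctl: minimal sqlldr control file
      LOAD DATA
      INFILE 'data.csv'
      INTO TABLE mytable
      FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
      (id, name)

      -- invoked from the shell as something like:
      -- sqlldr userid=scott/tiger control=load.ctl log=load.log
      ```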

