This repository has been archived by the owner on Dec 29, 2020. It is now read-only.

failed to create session #20

Open
LeArmadillo opened this issue Oct 18, 2016 · 13 comments
LeArmadillo commented Oct 18, 2016

Hi,

I'm having trouble establishing a connection to a Check Point CLM to pull logs using fw1-loggrabber. With the default lea_server auth_type of sslca I get the very generic error "ERROR: failed to initialize client/server-pair (NO Error)"; debugging shows that it is the pServer object which fails to initialize (fw1-loggrabber.c line 1761).

When I change lea_server auth_type to ssl_opsec or auth_opsec I get the error "ERROR: failed to create session (Argument is NULL or lacks some data)"; debugging shows that it is the pSession object which fails to initialize (fw1-loggrabber.c line 1773).

In both cases I feel that the error may have something to do with the following output lines:

[ 30697 4141156064]@opchs02cloudDN[18 Oct 12:57:26] opsec_read_cert_file: could not open file: "/home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12"
[ 30697 4141156064]@opchs02cloudDN[18 Oct 12:57:26] opsec_init_sslca: failed to read cert file

Is it possible to get more descriptive errors? I'm not really understanding why the client/server link is failing to initialize. Is there a way to rectify the failure to read the opsec.p12 file? Since this file holds the credentials for the CLM, it makes sense that this would cause the connection to fail. Is the "failed to create session" error further along in the process than the "failed to initialize client/server-pair" error? I'd be very grateful for any advice; my apologies if these are basic/obvious questions, but I'm a bit new to this :)

lea.conf:

lea_server auth_type sslca
lea_server ip 62.***.***.126
lea_server auth_port 18184
lea_server port 18184
opsec_sic_name "CN=LogGrabberOPSEC,O=******"
opsec_sslca_file "/home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12"
lea_server opsec_entity_sic_name "cn=cp_mgmt,o=******"
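
A quick way to sanity-check that this path resolves and is readable by the account running fw1-loggrabber (a minimal sketch, not specific to OPSEC):

# does the file exist, and who owns it?
ls -l /home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12

# is it readable by the current account? (run as the account that starts fw1-loggrabber)
test -r /home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12 && echo readable || echo NOT readable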

fw1-loggrabber.conf:

# DEBUG_LEVEL=<debuglevel>
DEBUG_LEVEL="1"

# FW1_LOGFILE=<Name of FW1-Logfilename>
FW1_LOGFILE="fw.log"

# FW1_OUTPUT=<files|logs>
FW1_OUTPUT="logs"

# FW1_TYPE=<ng|2000>
FW1_TYPE="ng"

# FW1_MODE=<audit|normal>
FW1_MODE="normal"

# ONLINE_MODE=<yes|no>
ONLINE_MODE="yes"

# RESOLVE_MODE=<yes|no>
RESOLVE_MODE="no"

# SHOW_FIELDNAMES=<yes|no>
SHOW_FIELDNAMES="yes"

# RECORD_SEPARATOR=<char>
RECORD_SEPARATOR="|"

# DATEFORMAT=<cp|unix|std>
#   cp   = " 3Feb2004 14:15:16"
#   unix = "1051655431"
#   std  = "2004-02-03 14:15:16"
DATEFORMAT="std"

# IGNORE_FIELDS=<field1;field2;...>
#IGNORE_FIELDS="uuid;__policy_id_tag"

# LOGGING_CONFIGURATION=<screen|file|syslog>
LOGGING_CONFIGURATION=file

# OUTPUT_FILE_PREFIX=<Path and Name of outputfile>
OTPUT_FILE_PREFIX="fw1-loggrabber"

# SYSLOG_FACILITY=<USER|LOCAL0|...|LOCAL7>
SYSLOG_FACILITY="LOCAL1"
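
One possible reason for the repeated "Illegal entry in configuration file" warnings seen in the traces below is that the file contains non-ASCII quote characters or Windows line endings rather than bad settings; a quick way to inspect the raw bytes (assuming GNU coreutils and grep):

# show line endings and control characters ("^M" at end of line means CRLF)
cat -A fw1-loggrabber.conf | head -40

# report the file type and line terminators
file fw1-loggrabber.conf

# look for non-ASCII characters such as smart quotes
grep -nP '[^\x00-\x7F]' fw1-loggrabber.conf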

full debug trace:

[root@opchs02cloudDN fw1-loggrabber]# ./fw1-loggrabber -l lea.conf -c fw1-loggrabber.conf
WARNING: You specified a relative path for opsec_sslca_file in
         /home/Bluesky/fw1-loggrabber/fw1-loggrabber/lea.conf. When not using an
         absolute path, the certificate will be searched in
         $LOGGRABBER_TEMP_PATH or in current working.
         directory if $LOGGRABBER_TEMP_PATH is not set.
WARNING: Illegal entry in configuration file: FW1_OUTPUT=logs"
WARNING: Illegal entry in configuration file: FW1_TYPE=ng"
WARNING: Illegal entry in configuration file: FW1_MODE=normal"
WARNING: Illegal entry in configuration file: ONLINE_MODE=yes"
WARNING: Illegal entry in configuration file: RESOLVE_MODE=no"
WARNING: Illegal entry in configuration file: SHOW_FIELDNAMES="yes"
WARNING: Illegal entry in configuration file: DATEFORMAT=std"
WARNING: Illegal entry in configuration file: LOGGING_CONFIGURATION=file
WARNING: Illegal entry in configuration file: SYSLOG_FACILITY=LOCAL1"
WARNING: Illegal entry in configuration file: SYSLOG_FACILITY=LOCAL1"
DEBUG: Open connection to screen.
DEBUG: Logfilename      : fw.log"
DEBUG: Record Separator : |
DEBUG: Resolve Addresses: Yes
DEBUG: Show Filenames   : No
DEBUG: FW1-2000         : No
DEBUG: Online-Mode      : No
DEBUG: Audit-Log        : No
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:41:55] Env Configuration:
""
 :type (opsec_info)
 :lea_server (
"):auth_type ("sslca
"):ip ("62.***.***.126
"):auth_port ("18184
"):port ("18184
"):opsec_entity_sic_name ("'cn=cp_mgmt,o=******'
 )
")opsec_sic_name ("'CN=LogGrabberOPSEC,O=******'
")opsec_sslca_file ("'/home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12'
)

[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:41:55] opsec_initdir: opsec dir already initialized to: /tmp
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:41:55] Could not find info for ...opsec_shared_local_path...
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:41:55] Could not find info for ...opsec_sic_policy_file...
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:41:55] Could not find info for ...opsec_mt...
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:41:55] opsec_init: multithread safety is not initialized
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:41:55] cpprng_opsec_initialize: path is not initialized - will initialize
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:41:55] cpprng_opsec_initialize: full file name is /tmp/ops_prng

[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] cpprng_opsec_initialize: dev_urandom_poll returned 0
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] opsec_file_is_intialized: seed is initialized
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] cpprng_opsec_initialize: seed init for opsec succeeded
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] PM_policy_create: version 5301.
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] PM_policy_add_name_to_group: finished successfully.
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] PM_policy_set_local_names: () names. finished successfully.
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] PM_policy_create: finished successfully.
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] PM_policy_add_name_to_group: finished successfully.
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] PM_policy_set_local_names: (local_sic_name) names. finished successfully.
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] PM_policy_add_name_to_group: finished successfully.
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] PM_policy_set_local_names: (127.0.0.1) names. finished successfully.
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] PM_policy_add_name_to_group: finished successfully.
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] PM_policy_set_local_names: (""CN=LogGrabberOPSEC,O=******") names. finished successfully.
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] PM_apply_default_dn: finished successfully.
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] opsec_read_cert_file: could not open file: "/home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12"
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] opsec_init_sslca: failed to read cert file
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] opsec_init_sic_id_internal: Added sic id (ctx id = 0)
DEBUG: OPSEC LEA conf file is /home/Bluesky/fw1-loggrabber/fw1-loggrabber/lea.conf
DEBUG: Authentication mode has been used.
DEBUG: Server-IP     : 62.***.***.126
DEBUG: Server-Port     : 18184
DEBUG: Authentication type: sslca
DEBUG: OPSEC sic certificate file name : "/home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12"
DEBUG: Server DN (sic name) : "cn=cp_mgmt,o=******"
DEBUG: OPSEC LEA client DN (sic name) : "CN=LogGrabberOPSEC,O=******"
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] opsec_init_entity_sic: called for the client side
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] Configuring entity lea_server
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] Could not find info for ...conn_buf_size...
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] Could not find info for ...no_nagle...
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] opsec_config_entity: unknown auth type
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] Destroying entity 2 with 0 active comms
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] opsec_destroy_entity_sic: deleting sic rules for entity 0x81bb168
ERROR: failed to initialize client/server-pair (NO Error)
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] Destroying entity 1 with 0 active comms
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] opsec_destroy_entity_sic: deleting sic rules for entity 0x81b9ff8
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] IpcUnMapFile: unmapping file (handle=0x81b9618)
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] IpcUnMapFile: unmapping file (handle=0x81b9df0)
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] IpcUnMapFile: unmapping file (handle=0x81b9e70)
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] IpcUnMapFile: unmapping file (handle=0x81b9f18)
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] IpcUnMapFile: unmapping file (handle=0x81b9fa8)
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] PM_policy_destroy: finished successfully.
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] opsec_destroy_sic_id_internal: Destroyed sic id (ctx id=0)
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] opsec_env_destroy_sic_id_hash: Destroyed sic id hash
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] fwd_env_destroy: env 0x81b4d08 (alloced = 1)
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] T_env_destroy: env 0x81b4d08
[ 23804 4140586720]@opchs02cloudDN[18 Oct 11:42:00] do_fwd_env_destroy:  really destroy 0x81b4d08
@adepasquale
Contributor

Hi @LeArmadillo,

This made me think:

WARNING: You specified a relative path for opsec_sslca_file in
         /home/Bluesky/fw1-loggrabber/fw1-loggrabber/lea.conf. When not using an
         absolute path, the certificate will be searched in
         $LOGGRABBER_TEMP_PATH or in current working.
         directory if $LOGGRABBER_TEMP_PATH is not set.

You probably should change this line in your lea.conf from this:

opsec_sslca_file "/home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12"

to this (removing the double quotes):

opsec_sslca_file /home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12
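
If it is easier to do this non-interactively, a small sketch that strips the quotes from that one line in place (it assumes the line looks exactly as quoted above; keep a backup):

cp lea.conf lea.conf.bak
sed -i 's|^opsec_sslca_file "\(.*\)"$|opsec_sslca_file \1|' lea.conf
grep '^opsec_sslca_file' lea.conf

According to the warning text, $LOGGRABBER_TEMP_PATH only matters for relative paths, so with an unquoted absolute path that fallback should not come into play.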

@LeArmadillo
Author

LeArmadillo commented Oct 18, 2016

Hi adepasquale,

Thank you for your response,

I've played around with the file path for opsec.p12 a few times with no success. However, removing the double quotes from the opsec_sic_name and lea_server opsec_entity_sic_name values in lea.conf did change the output.
Additionally, I changed the permissions on the home directory to allow write access.
Below is the output I'm now getting; the root cause of the failure still seems to be opsec_read_cert_file failing to read:
[ 11648 4140824288]@opchs02cloudDN[18 Oct 15:22:40] opsec_read_cert_file: could not open file: /home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12
[ 11648 4140824288]@opchs02cloudDN[18 Oct 15:22:40] opsec_init_sslca: failed to read cert file
Do you have any suggestions as to why fw1-loggrabber still can't read the file?
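
Since the home-directory permissions were changed, one more thing worth checking is that every parent directory in the path grants search (execute) permission to the account running the tool; read permission on the file alone is not enough. A minimal check (namei is part of util-linux):

# show the mode of each path component leading to the certificate
namei -m /home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12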

FULL DEBUG OUTPUT:

[root@opchs02cloudDN fw1-loggrabber]# ./fw1-loggrabber -c fw1-loggrabber.conf -l lea.conf
WARNING: Illegal entry in configuration file: FW1_OUTPUT=logs"
WARNING: Illegal entry in configuration file: FW1_TYPE=ng"
WARNING: Illegal entry in configuration file: FW1_MODE=normal"
WARNING: Illegal entry in configuration file: ONLINE_MODE=yes"
WARNING: Illegal entry in configuration file: RESOLVE_MODE=no"
WARNING: Illegal entry in configuration file: SHOW_FIELDNAMES="yes"
WARNING: Illegal entry in configuration file: DATEFORMAT=std"
WARNING: Illegal entry in configuration file: LOGGING_CONFIGURATION=file
WARNING: Illegal entry in configuration file: OTPUT_FILE_PREFIX="fw1-loggrabber"
WARNING: Illegal entry in configuration file: SYSLOG_FACILITY=LOCAL1"
DEBUG: Open connection to screen.
DEBUG: Logfilename : fw.log"
DEBUG: Record Separator : |
DEBUG: Resolve Addresses: Yes
DEBUG: Show Filenames : No
DEBUG: FW1-2000 : No
DEBUG: Online-Mode : No
DEBUG: Audit-Log : No
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:47] Env Configuration:
""
:type (opsec_info)
:lea_server (
:auth_type (
") :ssl_opsec ("
)
") :ip ("62._._.126
") :auth_port ("18184
") :port ("18184
") :opsec_entity_sic_name ("cn=cp_mgmt,o=******
)
") :opsec_sic_name ("CN=LogGrabberOPSEC,O=******
") :opsec_sslca_file ("/home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12
)

[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:47] opsec_initdir: opsec dir already initialized to: /tmp
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:47] Could not find info for ...opsec_shared_local_path...
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:47] Could not find info for ...opsec_sic_policy_file...
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:47] Could not find info for ...opsec_mt...
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:47] opsec_init: multithread safety is not initialized
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:47] cpprng_opsec_initialize: path is not initialized - will initialize
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:47] cpprng_opsec_initialize: full file name is /tmp/ops_prng
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] cpprng_opsec_initialize: dev_urandom_poll returned 0
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_file_is_intialized: seed is initialized
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] cpprng_opsec_initialize: seed init for opsec succeeded
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] PM_policy_create: version 5301.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] PM_policy_add_name_to_group: finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] PM_policy_set_local_names: () names. finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] PM_policy_create: finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] PM_policy_add_name_to_group: finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] PM_policy_set_local_names: (local_sic_name) names. finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] PM_policy_add_name_to_group: finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] PM_policy_set_local_names: (127.0.0.1) names. finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] PM_policy_add_name_to_group: finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] PM_policy_set_local_names: ("CN=LogGrabberOPSEC,O=_") names. finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] CkpRegDir: Environment variable CPDIR is not set.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] GenerateGlobalEntry: Unable to get registry path
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] get_bc_ds_choiceID: Failed to open registry; using default
].10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] PM_apply_default_dn: ca_dn = [O=_

[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] PM_apply_default_dn: calling PM_policy_DN_conversion ..
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] PM_apply_default_dn: finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_read_cert_file: could not open file: /home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_init_sslca: failed to read cert file
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_init_sic_id_internal: Added sic id (ctx id = 0)
DEBUG: OPSEC LEA conf file is /home/Bluesky/fw1-loggrabber/fw1-loggrabber/lea.conf
DEBUG: Clear text mode has been used.
DEBUG: Server-IP : 62._._.126
DEBUG: Server-Port : 18184
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_init_entity_sic: called for the client side
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] Configuring entity lea_server
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] Could not find info for ...conn_buf_size...
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] Could not find info for ...no_nagle...
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_init_entity_sic: Authentication not initialized...
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_init_entity_sic: adding default rule
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_entity_add_sic_rule: adding rules: apply_to: ANY, peer: ANY, d_ip: ANY, dport ANY, svc: lea, method: fwn1
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_entity_add_sic_rule: adding INBOUND rule
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_entity_add_sic_rule: adding OUTBOUND (IP) rule
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_entity_add_sic_rule: adding OUTBOUND rule
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_get_comm: creating comm for ent=9964080 peer=99651f0 passive=0 key=2 info=0
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] c=0x9964080 s=0x99651f0 comm_type=4

[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] Could not find info for ...opsec_client...
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_get_comm: Creating session hash (size=256)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_get_comm: ADDING comm=0x9967758 to ent=0x9964080 with key=2
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_env_get_context_id_by_peer_sic_name: found context id=0 for peer sic name=cn=cp_mgmt,o=******
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_env_get_sic_handle_by_context_id: found sic handle (ctx id=0)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_sic_connect: connecting... (ctx id=0)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] fw_do_get_all_ipaddrs: called. naddrs=32769

[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] resolver_getaddrinfo_list: name=opchs02cloudDN, pref=0
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] resolver_getaddrinfo_list: found peer 0 192.168.10.5
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] fw_do_get_all_ipaddrs: fw_ipaddr_both returned 192.168.10.5 ::

[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] fw_do_get_all_ipaddrs: found 0 addresses

[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] sic_init_myaddr_ex: could not get my own IPv6 addresses.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] resolver_gethostbyname: Performing gethostbyname for opchs02cloudDN
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] peers addresses are
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] 192.168.10.5
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] CkpRegDir: Environment variable CPDIR is not set.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] GenerateGlobalEntry: Unable to get registry path
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] cpsicdemux_get_mode: the mode is 1
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] cpsicdemux_check_mode: server_mode=1 | requested_mode=1
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] fwasync_get_maxbuf: maxbuf=4194304
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] T_event_epoll_report: EPOLL API disabled; SELECT is used
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] SESSION ID:3 is sending DG_TYPE=1

[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] pushing dgtype=1 len=0 to list=0x9967774
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] SESSION ID:3 is sending DG_TYPE=402

[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] pushing dgtype=402 len=27 to list=0x9967774
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] fwasync_connected: 12: getpeername: Transport endpoint is not connected
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_auth_client_connected: connection to server failed.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_auth_client_connected:conn=(nil) opaque=0x99663e8 err=0 comm=0x9967758
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] comm failed to connect 0x9967758
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] OPSEC_SET_ERRNO: err = 8 Comm is not connected/Unable to connect (pre = 0)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] COM 0x9967758 got signal 131075
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] destroying comm 0x9967758
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] Destroying comm 0x9967758 with 1 active sessions
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] Destroying session (996a790) id 3 (ent=9964080) reason=COMM_IS_DEAD
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] SESSION ID:3 is sending DG_TYPE=3

DEBUG: OPSEC_SESSION_END_HANDLER called
ERROR: No communication.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_del_event : event ctx is not activated

[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_del_event : event ctx is not activated

[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_del_event : event ctx is not activated

[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_comm_is_needed:comm 0x9967758 1/1 sessions need the comm.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] pulling dgtype=1 len=0 to list=0x9967774
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] pulling dgtype=402 len=27 to list=0x9967774
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] pulling dgtype=ffffffff len=-1 to list=0x9967774
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] REMOVING comm=0x9967758 from ent=0x9964080 with key=2
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] sic_client_connected: SIC error - Client could not connect to server
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] fwasync_do_end_conn: 12: calling 0xf7631830 to free opaque 0x9967c30
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] T_event_mainloop_e: T_event_mainloop_iter returns 0
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] Destroying entity 1 with 0 active comms
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_destroy_entity_sic: deleting sic rules for entity 0x9964080
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] Destroying entity 2 with 0 active comms
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_destroy_entity_sic: deleting sic rules for entity 0x99651f0
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] IpcUnMapFile: unmapping file (handle=0x99636f8)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] IpcUnMapFile: unmapping file (handle=0x9963e78)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] IpcUnMapFile: unmapping file (handle=0x9963ef8)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] IpcUnMapFile: unmapping file (handle=0x9963fa0)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] IpcUnMapFile: unmapping file (handle=0x9964030)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] PM_policy_destroy: finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_destroy_sic_id_internal: Destroyed sic id (ctx id=0)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_env_destroy_sic_id_hash: Destroyed sic id hash
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] fwd_env_destroy: env 0x995ed08 (alloced = 1)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] T_env_destroy: env 0x995ed08
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] do_fwd_env_destroy: really destroy 0x995ed08
DEBUG: Processing Logfile: fw.log"
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] Env Configuration:
""
:type (opsec_info)
:lea_server (
:auth_type (
") :ssl_opsec ("
)
") :ip ("62._._.126
") :auth_port ("18184
") :port ("18184
") :opsec_entity_sic_name ("cn=cp_mgmt,o=******
)
") :opsec_sic_name ("CN=LogGrabberOPSEC,O=******
") :opsec_sslca_file ("/home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12
)

[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_initdir: opsec dir already initialized to: /tmp
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] Could not find info for ...opsec_shared_local_path...
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] Could not find info for ...opsec_sic_policy_file...
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] Could not find info for ...opsec_mt...
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:12:55] opsec_init: multithread safety is not initialized
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] cpprng_opsec_initialize: dev_urandom_poll returned 0
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_file_is_intialized: seed is initialized
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] cpprng_opsec_initialize: seed init for opsec succeeded
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] PM_policy_create: version 5301.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] PM_policy_add_name_to_group: finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] PM_policy_set_local_names: () names. finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] PM_policy_create: finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] PM_policy_add_name_to_group: finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] PM_policy_set_local_names: (local_sic_name) names. finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] PM_policy_add_name_to_group: finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] PM_policy_set_local_names: (127.0.0.1) names. finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] PM_policy_add_name_to_group: finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] PM_policy_set_local_names: ("CN=LogGrabberOPSEC,O=_") names. finished successfully.
].10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] PM_apply_default_dn: ca_dn = [O=_

[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] PM_apply_default_dn: calling PM_policy_DN_conversion ..
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] PM_apply_default_dn: finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_read_cert_file: could not open file: /home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_init_sslca: failed to read cert file
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_init_sic_id_internal: Added sic id (ctx id = 0)
DEBUG: OPSEC LEA conf file is /home/Bluesky/fw1-loggrabber/fw1-loggrabber/lea.conf
DEBUG: Clear text mode has been used.
DEBUG: Server-IP : 62._._.126
DEBUG: Server-Port : 18184
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_init_entity_sic: called for the client side
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] Configuring entity lea_server
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] Could not find info for ...conn_buf_size...
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] Could not find info for ...no_nagle...
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_init_entity_sic: Authentication not initialized...
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_init_entity_sic: adding default rule
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_entity_add_sic_rule: adding rules: apply_to: ANY, peer: ANY, d_ip: ANY, dport ANY, svc: lea, method: fwn1
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_entity_add_sic_rule: adding INBOUND rule
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_entity_add_sic_rule: adding OUTBOUND (IP) rule
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_entity_add_sic_rule: adding OUTBOUND rule
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_get_comm: creating comm for ent=9962268 peer=9965e98 passive=0 key=2 info=0
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] c=0x9962268 s=0x9965e98 comm_type=4

[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] Could not find info for ...opsec_client...
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_get_comm: Creating session hash (size=256)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_get_comm: ADDING comm=0x996c000 to ent=0x9962268 with key=2
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_env_get_context_id_by_peer_sic_name: found context id=0 for peer sic name=cn=cp_mgmt,o=******
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_env_get_sic_handle_by_context_id: found sic handle (ctx id=0)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_sic_connect: connecting... (ctx id=0)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] peers addresses are
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] 192.168.10.5
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] cpsicdemux_get_mode: the mode is 1
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] cpsicdemux_check_mode: server_mode=1 | requested_mode=1
DEBUG: OPSEC session start handler was invoked
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] SESSION ID:3 is sending DG_TYPE=1

[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] pushing dgtype=1 len=0 to list=0x996c01c
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] SESSION ID:3 is sending DG_TYPE=402

[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] pushing dgtype=402 len=29 to list=0x996c01c
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] SESSION ID:3 is sending DG_TYPE=40c

[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] pushing dgtype=40c len=0 to list=0x996c01c
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] fwasync_connected: 14: getpeername: Transport endpoint is not connected
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_auth_client_connected: connection to server failed.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_auth_client_connected:conn=(nil) opaque=0x9966088 err=0 comm=0x996c000
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] comm failed to connect 0x996c000
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] OPSEC_SET_ERRNO: err = 8 Comm is not connected/Unable to connect (pre = 8)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] COM 0x996c000 got signal 131075
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] destroying comm 0x996c000
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] Destroying comm 0x996c000 with 1 active sessions
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] Destroying session (996cbe8) id 3 (ent=9962268) reason=COMM_IS_DEAD
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] SESSION ID:3 is sending DG_TYPE=3

DEBUG: OPSEC_SESSION_END_HANDLER called
ERROR: No communication.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_comm_is_needed:comm 0x996c000 1/1 sessions need the comm.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] pulling dgtype=1 len=0 to list=0x996c01c
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] pulling dgtype=402 len=29 to list=0x996c01c
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] pulling dgtype=40c len=0 to list=0x996c01c
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] pulling dgtype=ffffffff len=-1 to list=0x996c01c
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] REMOVING comm=0x996c000 from ent=0x9962268 with key=2
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] sic_client_connected: SIC error - Client could not connect to server
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] trigger_ctx_deletor: received UNRAISE_DEL_HANDLER handler event (env = 0x995ecd0)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] fwasync_do_end_conn: 14: calling 0xf7631830 to free opaque 0x996c4d8
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] T_event_mainloop_e: T_event_mainloop_iter returns 0
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] Destroying entity 1 with 0 active comms
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_destroy_entity_sic: deleting sic rules for entity 0x9962268
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] Destroying entity 2 with 0 active comms
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_destroy_entity_sic: deleting sic rules for entity 0x9965e98
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] IpcUnMapFile: unmapping file (handle=0x9967cf0)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] IpcUnMapFile: unmapping file (handle=0x9967df0)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] IpcUnMapFile: unmapping file (handle=0x9967e80)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] IpcUnMapFile: unmapping file (handle=0x9967f28)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] IpcUnMapFile: unmapping file (handle=0x9967fb8)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] PM_policy_destroy: finished successfully.
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_destroy_sic_id_internal: Destroyed sic id (ctx id=0)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] opsec_env_destroy_sic_id_hash: Destroyed sic id hash
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] fwd_env_destroy: env 0x995ecd0 (alloced = 1)
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] T_env_destroy: env 0x995ecd0
[ 10749 4140537568]@opchs02cloudDN[18 Oct 15:13:41] do_fwd_env_destroy: really destroy 0x995ecd0
DEBUG: Close connection to screen.

@LeArmadillo
Author

Upon further investigation it seems that it throws the error
opsec_read_cert_file: could not open file: /home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12
even when the file is not present or is named differently. I find this quite bemusing, as the file is definitely there, and I've even placed it in different folders and changed the path accordingly. My lea.conf is below for comparison. Am I missing something obvious, or is a library clash preventing the file read?

lea.conf:
lea_server auth_type sslca
lea_server ip 62._._.126
lea_server auth_port 18184
lea_server port 18184
opsec_sic_name CN=LogGrabberOPSEC,O=******
opsec_sslca_file /home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12
lea_server opsec_entity_sic_name cn=cp_mgmt,o=******
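
One way to see exactly which path the process tries to open and which errno comes back is to trace the file-related system calls while running the tool. A rough sketch (assuming strace is installed; the syscall names in the filter may need adjusting for your strace version):

strace -f -e trace=open,openat,access,stat ./fw1-loggrabber -l lea.conf -c fw1-loggrabber.conf 2>&1 | grep -i 'opsec.p12'

An ENOENT on an unexpected path would mean the library is looking somewhere other than the configured location; an EACCES would point back at permissions or SELinux.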

@adepasquale
Contributor

First, please clarify: would you like to use SSL CA or SSL OPSEC?

https://github.com/certego/fw1-loggrabber/wiki/Configure-remote-Checkpoint-device

@LeArmadillo
Author

LeArmadillo commented Oct 19, 2016

Hi @adepasquale

SSL CA. I also tried SSL OPSEC to see if the output was different, but it is the same for both. I have verified that opsec.p12 was generated correctly using opsec_pull_cert, and the server has been configured for SSL CA as in the guide above. I'm still getting the error:

"opsec_read_cert_file: could not open file: /home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12"

The error echoes the file path exactly as given in lea.conf even when opsec.p12 is not present, so the problem seems to be that the program can't find the file rather than that it can't open it.
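
Another thing that can produce a read failure even when the classic permissions look fine, on RHEL/CentOS-style hosts, is SELinux. This is only a guess, not a confirmed cause, but it is quick to check (assuming the audit tools are installed):

# is SELinux enforcing?
getenforce

# any recent denials mentioning the certificate or the binary?
ausearch -m avc -ts recent 2>/dev/null | grep -iE 'opsec|loggrabber'

# for a short test only (not in production), enforcement can be switched off temporarily:
# setenforce 0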

full debug trace SSL CA:

[Bluesky@opchs02cloudDN fw1-loggrabber]$ ./fw1-loggrabber
WARNING: Illegal entry in configuration file: FW1_OUTPUT=logs"
WARNING: Illegal entry in configuration file: FW1_TYPE=ng"
WARNING: Illegal entry in configuration file: FW1_MODE=normal"
WARNING: Illegal entry in configuration file: ONLINE_MODE=yes"
WARNING: Illegal entry in configuration file: RESOLVE_MODE=no"
WARNING: Illegal entry in configuration file: SHOW_FIELDNAMES="yes"
WARNING: Illegal entry in configuration file: DATEFORMAT=std"
WARNING: Illegal entry in configuration file: LOGGING_CONFIGURATION=file
WARNING: Illegal entry in configuration file: OTPUT_FILE_PREFIX="fw1-loggrabber"
WARNING: Illegal entry in configuration file: SYSLOG_FACILITY=LOCAL1"
DEBUG: Open connection to screen.
DEBUG: Logfilename : fw.log"
DEBUG: Record Separator : |
DEBUG: Resolve Addresses: Yes
DEBUG: Show Filenames : No
DEBUG: FW1-2000 : No
DEBUG: Online-Mode : No
DEBUG: Audit-Log : No
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:27] Env Configuration:
""
:type (opsec_info)
:lea_server (
:auth_type (
") :sslca ("
)
") :ip ("62._._.126
") :auth_port ("18184
") :port ("18184
") :opsec_entity_sic_name ("cn=cp_mgmt,o=******
)
")opsec_sic_name ("CN=LogGrabberOPSEC,O=******
")opsec_sslca_file ("/home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12
)

[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:27] opsec_initdir: opsec dir already initialized to: /tmp
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:27] Could not find info for ...opsec_shared_local_path...
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:27] Could not find info for ...opsec_sic_policy_file...
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:27] Could not find info for ...opsec_mt...
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:27] opsec_init: multithread safety is not initialized
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:27] cpprng_opsec_initialize: path is not initialized - will initialize
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:27] cpprng_opsec_initialize: full file name is /tmp/ops_prng

[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] create_rand_mutex: failed to create mutex: Operation not permitted
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] fwrand_write_seed: Failed to create mutex.: Permission denied
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] create_rand_mutex: failed to create mutex: Operation not permitted
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] cpprng_opsec_set_initialized: Failed to create mutex.: Permission denied
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] cpprng_opsec_initialize: dev_urandom_poll returned -1
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] create_rand_mutex: failed to create mutex: Operation not permitted
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] cpprng_opsec_is_initialized: Failed to create mutex.: Permission denied
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] cpprng_opsec_initialize: seed init for opsec failed but file was created
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_init_sic: failed to initialize seed. Seed will be initialized later.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] PM_policy_create: version 5301.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] PM_policy_add_name_to_group: finished successfully.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] PM_policy_set_local_names: () names. finished successfully.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] PM_policy_create: finished successfully.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] PM_policy_add_name_to_group: finished successfully.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] PM_policy_set_local_names: (local_sic_name) names. finished successfully.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] PM_policy_add_name_to_group: finished successfully.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] PM_policy_set_local_names: (127.0.0.1) names. finished successfully.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] PM_policy_add_name_to_group: finished successfully.
") names. finished successfully.DN[19 Oct 9:42:51] PM_policy_set_local_names: ("CN=LogGrabberOPSEC,O=******
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] get_bc_ds_choiceID: Failed to open registry; using default
].13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] PM_apply_default_dn: ca_dn = [O=******
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] PM_apply_default_dn: calling PM_policy_DN_conversion ..
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] PM_apply_default_dn: finished successfully.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_read_cert_file: could not open file: /home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_init_sslca: failed to read cert file
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_init_sic_id_internal: Added sic id (ctx id = 0)
DEBUG: OPSEC LEA conf file is /home/Bluesky/fw1-loggrabber/fw1-loggrabber/lea.conf
DEBUG: Clear text mode has been used.
DEBUG: Server-IP : 62._._.126
DEBUG: Server-Port : 18184
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_init_entity_sic: called for the client side
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] Configuring entity lea_server
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] Could not find info for ...conn_buf_size...
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] Could not find info for ...no_nagle...
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_init_entity_sic: Authentication not initialized...
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_init_entity_sic: adding default rule
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_entity_add_sic_rule: adding rules: apply_to: ANY, peer: ANY, d_ip: ANY, dport ANY, svc: lea, method: fwn1
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_entity_add_sic_rule: adding INBOUND rule
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_entity_add_sic_rule: adding OUTBOUND (IP) rule
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_entity_add_sic_rule: adding OUTBOUND rule
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_get_comm: creating comm for ent=8abaf80 peer=8abc0f0 passive=0 key=2 info=0
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] c=0x8abaf80 s=0x8abc0f0 comm_type=4

[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] Could not find info for ...opsec_client...
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_get_comm: Creating session hash (size=256)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_get_comm: ADDING comm=0x8abe658 to ent=0x8abaf80 with key=2
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_env_get_context_id_by_peer_sic_name: found context id=0 for peer sic name=cn=cp_mgmt,o=******
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_env_get_sic_handle_by_context_id: found sic handle (ctx id=0)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_sic_connect: connecting... (ctx id=0)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] fw_do_get_all_ipaddrs: called. naddrs=32769

[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] resolver_getaddrinfo_list: name=opchs02cloudDN, pref=0
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] resolver_getaddrinfo_list: found peer 0 192.168.10.5
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] fw_do_get_all_ipaddrs: fw_ipaddr_both returned 192.168.10.5 ::

[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] fw_do_get_all_ipaddrs: found 0 addresses

[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] sic_init_myaddr_ex: could not get my own IPv6 addresses.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] resolver_gethostbyname: Performing gethostbyname for opchs02cloudDN
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] peers addresses are
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] 192.168.10.5
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] cpsicdemux_get_mode: the mode is 1
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] cpsicdemux_check_mode: server_mode=1 | requested_mode=1
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] fwasync_get_maxbuf: maxbuf=4194304
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] T_event_epoll_report: EPOLL API disabled; SELECT is used
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] SESSION ID:3 is sending DG_TYPE=1

[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] pushing dgtype=1 len=0 to list=0x8abe674
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] SESSION ID:3 is sending DG_TYPE=402

[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] pushing dgtype=402 len=27 to list=0x8abe674
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] fwasync_connected: 10: getpeername: Transport endpoint is not connected
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_auth_client_connected: connection to server failed.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_auth_client_connected:conn=(nil) opaque=0x8abd2e8 err=0 comm=0x8abe658
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] comm failed to connect 0x8abe658
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] OPSEC_SET_ERRNO: err = 8 Comm is not connected/Unable to connect (pre = 0)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] COM 0x8abe658 got signal 131075
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] destroying comm 0x8abe658
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] Destroying comm 0x8abe658 with 1 active sessions
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] Destroying session (8ac1690) id 3 (ent=8abaf80) reason=COMM_IS_DEAD
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] SESSION ID:3 is sending DG_TYPE=3

DEBUG: OPSEC_SESSION_END_HANDLER called
ERROR: No communication.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_del_event : event ctx is not activated

[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_del_event : event ctx is not activated

[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_del_event : event ctx is not activated

[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_comm_is_needed:comm 0x8abe658 1/1 sessions need the comm.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] pulling dgtype=1 len=0 to list=0x8abe674
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] pulling dgtype=402 len=27 to list=0x8abe674
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] pulling dgtype=ffffffff len=-1 to list=0x8abe674
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] REMOVING comm=0x8abe658 from ent=0x8abaf80 with key=2
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] sic_client_connected: SIC error - Client could not connect to server
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] fwasync_do_end_conn: 10: calling 0xf7655830 to free opaque 0x8abeb30
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] T_event_mainloop_e: T_event_mainloop_iter returns 0
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] Destroying entity 1 with 0 active comms
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_destroy_entity_sic: deleting sic rules for entity 0x8abaf80
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] Destroying entity 2 with 0 active comms
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_destroy_entity_sic: deleting sic rules for entity 0x8abc0f0
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] IpcUnMapFile: unmapping file (handle=0x8aba5f8)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] IpcUnMapFile: unmapping file (handle=0x8abad78)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] IpcUnMapFile: unmapping file (handle=0x8abadf8)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] IpcUnMapFile: unmapping file (handle=0x8abaea0)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] IpcUnMapFile: unmapping file (handle=0x8abaf30)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] PM_policy_destroy: finished successfully.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_destroy_sic_id_internal: Destroyed sic id (ctx id=0)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_env_destroy_sic_id_hash: Destroyed sic id hash
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] fwd_env_destroy: env 0x8ab5ce0 (alloced = 1)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] T_env_destroy: env 0x8ab5ce0
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] do_fwd_env_destroy: really destroy 0x8ab5ce0
DEBUG: Processing Logfile: fw.log"
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] Env Configuration:
""
:type (opsec_info)
:lea_server (
:auth_type (
") :sslca ("
)
") :ip ("62._._.126
") :auth_port ("18184
") :port ("18184
") :opsec_entity_sic_name ("cn=cp_mgmt,o=******
)
")opsec_sic_name ("CN=LogGrabberOPSEC,O=******
")opsec_sslca_file ("/home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12
)

[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_initdir: opsec dir already initialized to: /tmp
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] Could not find info for ...opsec_shared_local_path...
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] Could not find info for ...opsec_sic_policy_file...
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] Could not find info for ...opsec_mt...
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:42:51] opsec_init: multithread safety is not initialized
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] create_rand_mutex: failed to create mutex: Operation not permitted
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] fwrand_write_seed: Failed to create mutex.: Permission denied
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] create_rand_mutex: failed to create mutex: Operation not permitted
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] cpprng_opsec_set_initialized: Failed to create mutex.: Permission denied
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] cpprng_opsec_initialize: dev_urandom_poll returned -1
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] create_rand_mutex: failed to create mutex: Operation not permitted
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] cpprng_opsec_is_initialized: Failed to create mutex.: Permission denied
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] cpprng_opsec_initialize: seed init for opsec failed but file was created
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_init_sic: failed to initialize seed. Seed will be initialized later.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] PM_policy_create: version 5301.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] PM_policy_add_name_to_group: finished successfully.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] PM_policy_set_local_names: () names. finished successfully.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] PM_policy_create: finished successfully.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] PM_policy_add_name_to_group: finished successfully.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] PM_policy_set_local_names: (local_sic_name) names. finished successfully.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] PM_policy_add_name_to_group: finished successfully.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] PM_policy_set_local_names: (127.0.0.1) names. finished successfully.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] PM_policy_add_name_to_group: finished successfully.
") names. finished successfully.DN[19 Oct 9:43:45] PM_policy_set_local_names: ("CN=LogGrabberOPSEC,O=******
].13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] PM_apply_default_dn: ca_dn = [O=******
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] PM_apply_default_dn: calling PM_policy_DN_conversion ..
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] PM_apply_default_dn: finished successfully.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_read_cert_file: could not open file: /home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_init_sslca: failed to read cert file
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_init_sic_id_internal: Added sic id (ctx id = 0)
DEBUG: OPSEC LEA conf file is /home/Bluesky/fw1-loggrabber/fw1-loggrabber/lea.conf
DEBUG: Clear text mode has been used.
DEBUG: Server-IP : 62._._.126
DEBUG: Server-Port : 18184
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_init_entity_sic: called for the client side
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] Configuring entity lea_server
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] Could not find info for ...conn_buf_size...
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] Could not find info for ...no_nagle...
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_init_entity_sic: Authentication not initialized...
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_init_entity_sic: adding default rule
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_entity_add_sic_rule: adding rules: apply_to: ANY, peer: ANY, d_ip: ANY, dport ANY, svc: lea, method: fwn1
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_entity_add_sic_rule: adding INBOUND rule
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_entity_add_sic_rule: adding OUTBOUND (IP) rule
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_entity_add_sic_rule: adding OUTBOUND rule
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_get_comm: creating comm for ent=8ab91e8 peer=8abcd98 passive=0 key=2 info=0
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] c=0x8ab91e8 s=0x8abcd98 comm_type=4

[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] Could not find info for ...opsec_client...
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_get_comm: Creating session hash (size=256)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_get_comm: ADDING comm=0x8ac2f28 to ent=0x8ab91e8 with key=2
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_env_get_context_id_by_peer_sic_name: found context id=0 for peer sic name=cn=cp_mgmt,o=******
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_env_get_sic_handle_by_context_id: found sic handle (ctx id=0)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_sic_connect: connecting... (ctx id=0)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] peers addresses are
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] 192.168.10.5
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] cpsicdemux_get_mode: the mode is 1
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] cpsicdemux_check_mode: server_mode=1 | requested_mode=1
DEBUG: OPSEC session start handler was invoked
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] SESSION ID:3 is sending DG_TYPE=1

[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] pushing dgtype=1 len=0 to list=0x8ac2f44
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] SESSION ID:3 is sending DG_TYPE=402

[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] pushing dgtype=402 len=29 to list=0x8ac2f44
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] SESSION ID:3 is sending DG_TYPE=40c

[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] pushing dgtype=40c len=0 to list=0x8ac2f44
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] fwasync_connected: 12: getpeername: Transport endpoint is not connected
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_auth_client_connected: connection to server failed.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_auth_client_connected:conn=(nil) opaque=0x8abcf88 err=0 comm=0x8ac2f28
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] comm failed to connect 0x8ac2f28
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] OPSEC_SET_ERRNO: err = 8 Comm is not connected/Unable to connect (pre = 8)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] COM 0x8ac2f28 got signal 131075
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] destroying comm 0x8ac2f28
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] Destroying comm 0x8ac2f28 with 1 active sessions
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] Destroying session (8ac3b10) id 3 (ent=8ab91e8) reason=COMM_IS_DEAD
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] SESSION ID:3 is sending DG_TYPE=3

DEBUG: OPSEC_SESSION_END_HANDLER called
ERROR: No communication.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_comm_is_needed:comm 0x8ac2f28 1/1 sessions need the comm.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] pulling dgtype=1 len=0 to list=0x8ac2f44
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] pulling dgtype=402 len=29 to list=0x8ac2f44
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] pulling dgtype=40c len=0 to list=0x8ac2f44
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] pulling dgtype=ffffffff len=-1 to list=0x8ac2f44
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] REMOVING comm=0x8ac2f28 from ent=0x8ab91e8 with key=2
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] sic_client_connected: SIC error - Client could not connect to server
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] fwasync_do_end_conn: 12: calling 0xf7655830 to free opaque 0x8ac3400
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] trigger_ctx_deletor: received UNRAISE_DEL_HANDLER handler event (env = 0x8ab5c58)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] T_event_mainloop_e: T_event_mainloop_iter returns 0
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] Destroying entity 1 with 0 active comms
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_destroy_entity_sic: deleting sic rules for entity 0x8ab91e8
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] Destroying entity 2 with 0 active comms
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_destroy_entity_sic: deleting sic rules for entity 0x8abcd98
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] IpcUnMapFile: unmapping file (handle=0x8ab4070)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] IpcUnMapFile: unmapping file (handle=0x8abef20)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] IpcUnMapFile: unmapping file (handle=0x8abef60)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] IpcUnMapFile: unmapping file (handle=0x8abed58)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] IpcUnMapFile: unmapping file (handle=0x8abede8)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] PM_policy_destroy: finished successfully.
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_destroy_sic_id_internal: Destroyed sic id (ctx id=0)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] opsec_env_destroy_sic_id_hash: Destroyed sic id hash
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] fwd_env_destroy: env 0x8ab5c58 (alloced = 1)
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] T_env_destroy: env 0x8ab5c58
[ 13484 4140685024]@opchs02cloudDN[19 Oct 9:43:45] do_fwd_env_destroy: really destroy 0x8ab5c58
DEBUG: Close connection to screen.

@adepasquale
Contributor

Although I advise against doing this in production, you could try:

  • moving the opsec.p12 file to /tmp/opsec.p12;
  • running chmod a+r /tmp/opsec.p12;
  • replacing the path inside lea.conf;
  • running fw1-loggrabber again.
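
In shell terms, the above would be roughly the following (paths taken from your earlier output, so adjust them to your setup):

    mv /home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12 /tmp/opsec.p12
    chmod a+r /tmp/opsec.p12
    # then edit lea.conf so it points at the new location: opsec_sslca_file /tmp/opsec.p12
    ./fw1-loggrabber -l lea.conf -c fw1-loggrabber.conf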

@LeArmadillo
Author

I made the above changes and still have the same error result:

opsec_read_cert_file: could not open file: /tmp/opsec.p12

lea.conf
lea_server auth_type sslca
lea_server ip 62.*.*.126
lea_server auth_port 18184
lea_server port 18184
opsec_sic_name CN=LogGrabberOPSEC,O=******
opsec_sslca_file /tmp/opsec.p12
lea_server opsec_entity_sic_name cn=cp_mgmt,o=******

@adepasquale
Contributor

Weird error, I've never seen it before. Could you please try again with this lea.conf file? Note the quotes and the removed lea_server port line.

lea_server auth_type sslca
lea_server ip 62.*.*.126
lea_server auth_port 18184
opsec_sic_name "CN=LogGrabberOPSEC,O=******"
opsec_sslca_file /tmp/opsec.p12
lea_server opsec_entity_sic_name "cn=cp_mgmt,o=******"
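
Also worth double-checking that the certificate file really is readable by the user running fw1-loggrabber; a quick sanity check (file name as in the lea.conf above) could be:

    ls -l /tmp/opsec.p12
    file /tmp/opsec.p12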

@LeArmadillo
Author

Tried again; putting the quotes back in brought it back a step to the
"ERROR: failed to initialize client/server-pair (NO Error)" error, still with
"opsec_read_cert_file: could not open file: /tmp/opsec.p12"
as the underlying cause. Full debug trace below.

lea.conf:
lea_server auth_type sslca
lea_server ip 62.*.*.126
lea_server auth_port 18184
opsec_sic_name "CN=LogGrabberOPSEC,O=******"
opsec_sslca_file /tmp/opsec.p12
lea_server opsec_entity_sic_name "cn=cp_mgmt,o=******"

full debug:
[root@opchs02cloudDN fw1-loggrabber]# ./fw1-loggrabber
WARNING: Illegal entry in configuration file: FW1_OUTPUT=logs"
WARNING: Illegal entry in configuration file: FW1_TYPE=ng"
WARNING: Illegal entry in configuration file: FW1_MODE=normal"
WARNING: Illegal entry in configuration file: ONLINE_MODE=yes"
WARNING: Illegal entry in configuration file: RESOLVE_MODE=no"
WARNING: Illegal entry in configuration file: SHOW_FIELDNAMES="yes"
WARNING: Illegal entry in configuration file: DATEFORMAT=std"
WARNING: Illegal entry in configuration file: LOGGING_CONFIGURATION=file
WARNING: Illegal entry in configuration file: OTPUT_FILE_PREFIX="fw1-loggrabber"
WARNING: Illegal entry in configuration file: SYSLOG_FACILITY=LOCAL1"
DEBUG: Open connection to screen.
DEBUG: Logfilename : fw.log"
DEBUG: Record Separator : |
DEBUG: Resolve Addresses: Yes
DEBUG: Show Filenames : No
DEBUG: FW1-2000 : No
DEBUG: Online-Mode : No
DEBUG: Audit-Log : No
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:28] Env Configuration:
""
    :type (opsec_info)
    :lea_server (
        :auth_type ("sslca")
        :ip ("62.*.*.126")
        :auth_port ("18184")
        :opsec_entity_sic_name ("'cn=cp_mgmt,o=******'")
    )
    :opsec_sic_name ("'CN=LogGrabberOPSEC,O=******'")
    :opsec_sslca_file ("/tmp/opsec.p12")
)

[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:28] opsec_initdir: opsec dir already initialized to: /tmp
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:28] Could not find info for ...opsec_shared_local_path...
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:28] Could not find info for ...opsec_sic_policy_file...
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:28] Could not find info for ...opsec_mt...
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:28] opsec_init: multithread safety is not initialized
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:28] cpprng_opsec_initialize: path is not initialized - will initialize
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:28] cpprng_opsec_initialize: full file name is /tmp/ops_prng

[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] cpprng_opsec_initialize: dev_urandom_poll returned 0
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] opsec_file_is_intialized: seed is initialized
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] cpprng_opsec_initialize: seed init for opsec succeeded
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] PM_policy_create: version 5301.
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] PM_policy_add_name_to_group: finished successfully.
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] PM_policy_set_local_names: () names. finished successfully.
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] PM_policy_create: finished successfully.
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] PM_policy_add_name_to_group: finished successfully.
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] PM_policy_set_local_names: (local_sic_name) names. finished successfully.
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] PM_policy_add_name_to_group: finished successfully.
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] PM_policy_set_local_names: (127.0.0.1) names. finished successfully.
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] PM_policy_add_name_to_group: finished successfully.
") names. finished successfully.DN[19 Oct 10:46:30] PM_policy_set_local_names: (""CN=LogGrabberOPSEC,O="
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] PM_apply_default_dn: finished successfully.
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] opsec_read_cert_file: could not open file: /tmp/opsec.p12
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] opsec_init_sslca: failed to read cert file
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] opsec_init_sic_id_internal: Added sic id (ctx id = 0)
DEBUG: OPSEC LEA conf file is /home/Bluesky/fw1-loggrabber/fw1-loggrabber/lea.conf
DEBUG: Authentication mode has been used.
DEBUG: Server-IP : 62.*.*.126
DEBUG: Server-Port : 18184
DEBUG: Authentication type: sslca
DEBUG: OPSEC sic certificate file name : /tmp/opsec.p12
DEBUG: Server DN (sic name) : "cn=cp_mgmt,o=******"
DEBUG: OPSEC LEA client DN (sic name) : "CN=LogGrabberOPSEC,O=******"
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] opsec_init_entity_sic: called for the client side
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] Configuring entity lea_server
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] Could not find info for ...conn_buf_size...
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] Could not find info for ...no_nagle...
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] Could not find info for ...port...
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] opsec_config_entity: unknown auth type
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] Destroying entity 2 with 0 active comms
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] opsec_destroy_entity_sic: deleting sic rules for entity 0x9b210c8
ERROR: failed to initialize client/server-pair (NO Error)
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] Destroying entity 1 with 0 active comms
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] opsec_destroy_entity_sic: deleting sic rules for entity 0x9b1ff58
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] IpcUnMapFile: unmapping file (handle=0x9b1f578)
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] IpcUnMapFile: unmapping file (handle=0x9b1fd50)
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] IpcUnMapFile: unmapping file (handle=0x9b1fdd0)
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] IpcUnMapFile: unmapping file (handle=0x9b1fe78)
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] IpcUnMapFile: unmapping file (handle=0x9b1ff08)
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] PM_policy_destroy: finished successfully.
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] opsec_destroy_sic_id_internal: Destroyed sic id (ctx id=0)
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] opsec_env_destroy_sic_id_hash: Destroyed sic id hash
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] fwd_env_destroy: env 0x9b1acc0 (alloced = 1)
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] T_env_destroy: env 0x9b1acc0
[ 19257 4140697312]@opchs02cloudDN[19 Oct 10:46:30] do_fwd_env_destroy: really destroy 0x9b1acc0

@LeArmadillo
Author

Would you be able to point out where the error is created in the code? opsec_read_cert_file doesn't appear in the fw1-loggrabber source and produces no results on Google, so it's very hard to debug.

opsec_read_cert_file: could not open file: /home/Bluesky/fw1-loggrabber/fw1-loggrabber/opsec.p12

@adepasquale
Contributor

opsec_read_cert_file is part of the Check Point OPSEC libraries; there is some documentation at the bottom of this page.

The highlighted parts correspond to your code path, where the LEA client/server pair is initialized:

{
  //NG
  fw1_port = opsec_get_conf (pEnv, "lea_server", "auth_port", NULL);
  opsec_certificate = opsec_get_conf (pEnv, "opsec_sslca_file", NULL);
  opsec_client_dn = opsec_get_conf (pEnv, "opsec_sic_name", NULL);
  opsec_server_dn = opsec_get_conf (pEnv, "lea_server",
                                    "opsec_entity_sic_name", NULL);

  if ((fw1_port == NULL) || (opsec_certificate == NULL)
      || (opsec_client_dn == NULL) || (opsec_server_dn == NULL))
    {
      fprintf (stderr,
               "ERROR: The parameters about authentication mode have not been set.\n");
      exit_loggrabber (1);
    }
  else
    {
      fprintf (stderr, "DEBUG: Authentication mode has been used.\n");
      fprintf (stderr, "DEBUG: Server-IP : %s\n", fw1_server);
      fprintf (stderr, "DEBUG: Server-Port : %s\n", fw1_port);
      fprintf (stderr, "DEBUG: Authentication type: %s\n", auth_type);
      fprintf (stderr, "DEBUG: OPSEC sic certificate file name : %s\n",
               opsec_certificate);
      fprintf (stderr, "DEBUG: Server DN (sic name) : %s\n", opsec_server_dn);
      fprintf (stderr, "DEBUG: OPSEC LEA client DN (sic name) : %s\n",
               opsec_client_dn);
    } //end of inner if
}

/*
 * initialize opsec-client
 */
pClient = opsec_init_entity (pEnv, LEA_CLIENT,
                             LEA_RECORD_HANDLER, read_fw1_logfile_record,
                             LEA_DICT_HANDLER, read_fw1_logfile_dict,
                             LEA_EOF_HANDLER, read_fw1_logfile_eof,
                             LEA_SWITCH_HANDLER, read_fw1_logfile_switch,
                             LEA_FILTER_QUERY_ACK,
                             ((cfgvalues.audit_mode) ?
                               ((cfgvalues.audit_filter_count > 0) ?
                                 read_fw1_logfile_queryack : NULL) :
                               ((cfgvalues.fw1_filter_count > 0) ?
                                 read_fw1_logfile_queryack : NULL)),
                             LEA_COL_LOGS_HANDLER, read_fw1_logfile_collogs,
                             LEA_SUSPEND_HANDLER, read_fw1_logfile_suspend,
                             LEA_RESUME_HANDLER, read_fw1_logfile_resume,
                             OPSEC_SESSION_START_HANDLER, read_fw1_logfile_start,
                             OPSEC_SESSION_END_HANDLER, read_fw1_logfile_end,
                             OPSEC_SESSION_ESTABLISHED_HANDLER,
                             read_fw1_logfile_established, OPSEC_EOL);

/*
 * initialize opsec-server for authenticated and unauthenticated connections
 */
pServer =
  opsec_init_entity (pEnv, LEA_SERVER, OPSEC_ENTITY_NAME, "lea_server",
                     OPSEC_EOL);

/*
 * continue only if opsec initializations were successful
 */
if ((!pClient) || (!pServer))
  {
    fprintf (stderr,
             "ERROR: failed to initialize client/server-pair (%s)\n",
             opsec_errno_str (opsec_errno));
    cleanup_fw1_environment (pEnv, pClient, pServer);
    exit_loggrabber (1);
  }

This code snippet

          fprintf (stderr,
                   "ERROR: failed to initialize client/server-pair (%s)\n",
                   opsec_errno_str (opsec_errno));

is what appears in your debug output as:

ERROR: failed to initialize client/server-pair (NO Error)

Unfortunately I've never tested fw1-loggrabber with a CLM, so the last thing I can suggest is trying the unauthenticated connection mode.
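
For reference, and assuming the management side has the unauthenticated LEA port enabled (lea_server port 18184 in fwopsec.conf), a minimal lea.conf for clear mode should look roughly like this:

    lea_server ip 62.*.*.126
    lea_server port 18184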

Anyway, even if that works, I would NOT recommend using unauthenticated mode in production.

@asafbart

I have this issue also, does anyone have a solution?

@asafbart

asafbart commented Sep 20, 2017

This happened to me because, on the management side, the file "$FWDIR/conf/fwopsec.conf" had been changed
From

       # lea_server auth_port 18184
       # lea_server port 0 

To

        lea_server auth_port 0
        lea_server port 18184

because they had apparently configured it as "ssl_opsec", not "sslca"; see the description here: https://supportcenter.checkpoint.com/supportcenter/portal?eventSubmit_doGoviewsolutiondetails=&solutionid=sk32521

So in the lea configuration, just change it according to the instructions in the loggrabber "ssl_opsec" reference.
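
To see which mode the management side actually expects, it can help to check those lines directly on the management server, for example:

    grep lea_server $FWDIR/conf/fwopsec.conf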
