Incompatible with Cloudflare Tunnel ? Possible Bug #48

Open
aadityaverma opened this issue Mar 12, 2024 · 7 comments

Comments

@aadityaverma

I am using a Cloudflare Tunnel to route requests to Ollama. In the iOS app, responses get cut short, while in the macOS app no response is printed at all.

What information can I provide to help diagnose and resolve this bug?

aadityaverma changed the title from "Incomparible with Cloudflare Tunnel ? Possible Bug" to "Incompatible with Cloudflare Tunnel ? Possible Bug" on Mar 12, 2024
@AugustDev
Owner

Hi, could you please double-check that you're using the latest version of the iOS app? If you give more details about how you use the Cloudflare tunnel, I can try to reproduce it.

@aadityaverma
Author

Thank you for the prompt reply; it's a great initiative, kudos!
Yes, it's the latest iOS app; I still need to check the macOS version.
Ollama is running on Ubuntu 20.04 LTS, and the Cloudflare tunnel is configured following the instructions here (Cloudflare is also the registrar of the domain):
https://developers.cloudflare.com/cloudflare-one/connections/connect-networks/get-started/create-local-tunnel/
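
To help narrow down where the responses are being cut, here is a minimal sketch (not a confirmed diagnostic) that streams the same prompt to /api/chat both through the tunnel and directly over the LAN and compares how much text arrives. The hostnames, model name, and seed below are placeholders, and the `requests` library is assumed to be installed.

```python
import json
import requests

# Placeholder endpoints: replace with your tunnel hostname and LAN address.
ENDPOINTS = {
    "tunnel": "https://ollama.example.com",
    "lan": "http://192.168.1.10:11434",
}

PAYLOAD = {
    "model": "phi3:latest",
    "messages": [{"role": "user", "content": "Write three paragraphs about tunnels."}],
    # Pin sampling so both runs generate roughly the same output.
    "options": {"temperature": 0, "seed": 42},
}


def stream_chars(base_url: str) -> int:
    """Stream /api/chat (NDJSON chunks) and return the number of characters received."""
    total = 0
    with requests.post(f"{base_url}/api/chat", json=PAYLOAD, stream=True, timeout=120) as r:
        r.raise_for_status()
        for line in r.iter_lines():
            if not line:
                continue
            chunk = json.loads(line)
            total += len(chunk.get("message", {}).get("content", ""))
            if chunk.get("done"):
                break
    return total


for name, url in ENDPOINTS.items():
    print(name, stream_chars(url), "characters received")
```

If the LAN count is consistently larger than the tunnel count for the same prompt, the response is being cut somewhere between cloudflared and the client rather than inside Ollama itself.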

@aadityaverma
Author

aadityaverma commented Mar 12, 2024

Just updated the macOS app. Before, it wasn't showing any responses at all; now, like the iOS version, it shows partial replies. Hope this helps pinpoint the issue.

@aadityaverma
Author

Tried the new iOS update; still the same issue. (screenshot attached)

@Torreyc

Torreyc commented Apr 29, 2024

I am having the same issue when using Cloudflare with Ollama on macOS. ngrok works fine, but Cloudflare produces truncated results, which doesn't make sense to me.

#88
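
One way to isolate the failure mode (a sketch, not a confirmed fix): Ollama's /api/chat accepts "stream": false, which returns one complete JSON body instead of chunked NDJSON. If the non-streamed response arrives intact through the tunnel while the streamed one is cut short, that points at how the chunked stream is handled along the Cloudflare path. The hostname below is a placeholder.

```python
import requests

# Placeholder tunnel hostname; replace with your own.
BASE_URL = "https://ollama.example.com"

payload = {
    "model": "phi3:latest",
    "messages": [{"role": "user", "content": "List ten facts about tunnels."}],
    "stream": False,  # ask Ollama for a single complete JSON response instead of NDJSON chunks
}

resp = requests.post(f"{BASE_URL}/api/chat", json=payload, timeout=300)
resp.raise_for_status()
body = resp.json()

# The response reports how many tokens were generated server-side.
print("eval_count:", body.get("eval_count"))
print("content length:", len(body["message"]["content"]))
```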

@x1d0

x1d0 commented May 3, 2024

Same problem here

@pressdarling

In case this is useful for anybody, here are some logs and a screenshot. I don't know enough about what I'm doing here to debug more fully.

No system prompt, using vanilla ollama phi3:latest. Details in logs. Here's a screenshot:

[Screenshot: "1 out of 10 is ok I guess"]
ollama server log

[GIN] 2024/05/07 - 13:08:26 | 200 |      17.084µs | 2403:5804:300:0:5c96:5280:db34:b3d5 | HEAD     "/"
[GIN] 2024/05/07 - 13:08:27 | 200 |       9.042µs | 2403:5804:300:0:5c96:5280:db34:b3d5 | HEAD     "/"
time=2024-05-07T13:08:27.266+10:00 level=INFO source=memory.go:152 msg="offload to gpu" layers.real=-1 layers.estimate=33 memory.available="10922.7 MiB" memory.required.full="3593.9 MiB" memory.required.partial="3593.9 MiB" memory.required.kv="768.0 MiB" memory.weights.total="2157.9 MiB" memory.weights.repeating="2080.9 MiB" memory.weights.nonrepeating="77.1 MiB" memory.graph.full="156.0 MiB" memory.graph.partial="156.0 MiB"
time=2024-05-07T13:08:27.266+10:00 level=INFO source=memory.go:152 msg="offload to gpu" layers.real=-1 layers.estimate=33 memory.available="10922.7 MiB" memory.required.full="3593.9 MiB" memory.required.partial="3593.9 MiB" memory.required.kv="768.0 MiB" memory.weights.total="2157.9 MiB" memory.weights.repeating="2080.9 MiB" memory.weights.nonrepeating="77.1 MiB" memory.graph.full="156.0 MiB" memory.graph.partial="156.0 MiB"
time=2024-05-07T13:08:27.268+10:00 level=INFO source=server.go:289 msg="starting llama server" cmd="/var/folders/dj/k6n5rdcd3zzbl9vr_qg_91b40000gn/T/ollama1822296248/runners/metal/ollama_llama_server --model /Users/notmyrealusername/.ollama/models/blobs/sha256-4fed7364ee3e0c7cb4fe0880148bfdfcd1b630981efa0802a6b62ee52e7da97e --ctx-size 2048 --batch-size 512 --embedding --log-disable --n-gpu-layers 33 --parallel 1 --port 59799"
time=2024-05-07T13:08:27.270+10:00 level=INFO source=sched.go:340 msg="loaded runners" count=1
time=2024-05-07T13:08:27.270+10:00 level=INFO source=server.go:432 msg="waiting for llama runner to start responding"
{"function":"server_params_parse","level":"INFO","line":2606,"msg":"logging to file is disabled.","tid":"0x1f74a3ac0","timestamp":1715051307}
{"build":2770,"commit":"952d03d","function":"main","level":"INFO","line":2823,"msg":"build info","tid":"0x1f74a3ac0","timestamp":1715051307}
{"function":"main","level":"INFO","line":2830,"msg":"system info","n_threads":8,"n_threads_batch":-1,"system_info":"AVX = 0 | AVX_VNNI = 0 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | FMA = 0 | NEON = 1 | ARM_FMA = 1 | F16C = 0 | FP16_VA = 1 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 0 | SSSE3 = 0 | VSX = 0 | MATMUL_INT8 = 0 | LLAMAFILE = 1 | ","tid":"0x1f74a3ac0","timestamp":1715051307,"total_threads":10}
llama_model_loader: loaded meta data with 25 key-value pairs and 291 tensors from /Users/notmyrealusername/.ollama/models/blobs/sha256-4fed7364ee3e0c7cb4fe0880148bfdfcd1b630981efa0802a6b62ee52e7da97e (version GGUF V3 (latest))
llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
llama_model_loader: - kv   0:                       general.architecture str              = llama
llama_model_loader: - kv   1:                               general.name str              = LLaMA v2
llama_model_loader: - kv   2:                           llama.vocab_size u32              = 32064
llama_model_loader: - kv   3:                       llama.context_length u32              = 4096
llama_model_loader: - kv   4:                     llama.embedding_length u32              = 3072
llama_model_loader: - kv   5:                          llama.block_count u32              = 32
llama_model_loader: - kv   6:                  llama.feed_forward_length u32              = 8192
llama_model_loader: - kv   7:                 llama.rope.dimension_count u32              = 96
llama_model_loader: - kv   8:                 llama.attention.head_count u32              = 32
llama_model_loader: - kv   9:              llama.attention.head_count_kv u32              = 32
llama_model_loader: - kv  10:     llama.attention.layer_norm_rms_epsilon f32              = 0.000010
llama_model_loader: - kv  11:                       llama.rope.freq_base f32              = 10000.000000
llama_model_loader: - kv  12:                          general.file_type u32              = 15
llama_model_loader: - kv  13:                       tokenizer.ggml.model str              = llama
llama_model_loader: - kv  14:                      tokenizer.ggml.tokens arr[str,32064]   = ["<unk>", "<s>", "</s>", "<0x00>", "<...
llama_model_loader: - kv  15:                      tokenizer.ggml.scores arr[f32,32064]   = [0.000000, 0.000000, 0.000000, 0.0000...
llama_model_loader: - kv  16:                  tokenizer.ggml.token_type arr[i32,32064]   = [2, 3, 3, 6, 6, 6, 6, 6, 6, 6, 6, 6, ...
llama_model_loader: - kv  17:                tokenizer.ggml.bos_token_id u32              = 1
llama_model_loader: - kv  18:                tokenizer.ggml.eos_token_id u32              = 32000
llama_model_loader: - kv  19:            tokenizer.ggml.unknown_token_id u32              = 0
llama_model_loader: - kv  20:            tokenizer.ggml.padding_token_id u32              = 32000
llama_model_loader: - kv  21:               tokenizer.ggml.add_bos_token bool             = true
llama_model_loader: - kv  22:               tokenizer.ggml.add_eos_token bool             = false
llama_model_loader: - kv  23:                    tokenizer.chat_template str              = {{ bos_token }}{% for message in mess...
llama_model_loader: - kv  24:               general.quantization_version u32              = 2
llama_model_loader: - type  f32:   65 tensors
llama_model_loader: - type q4_K:  193 tensors
llama_model_loader: - type q6_K:   33 tensors
llm_load_vocab: special tokens definition check successful ( 323/32064 ).
llm_load_print_meta: format           = GGUF V3 (latest)
llm_load_print_meta: arch             = llama
llm_load_print_meta: vocab type       = SPM
llm_load_print_meta: n_vocab          = 32064
llm_load_print_meta: n_merges         = 0
llm_load_print_meta: n_ctx_train      = 4096
llm_load_print_meta: n_embd           = 3072
llm_load_print_meta: n_head           = 32
llm_load_print_meta: n_head_kv        = 32
llm_load_print_meta: n_layer          = 32
llm_load_print_meta: n_rot            = 96
llm_load_print_meta: n_embd_head_k    = 96
llm_load_print_meta: n_embd_head_v    = 96
llm_load_print_meta: n_gqa            = 1
llm_load_print_meta: n_embd_k_gqa     = 3072
llm_load_print_meta: n_embd_v_gqa     = 3072
llm_load_print_meta: f_norm_eps       = 0.0e+00
llm_load_print_meta: f_norm_rms_eps   = 1.0e-05
llm_load_print_meta: f_clamp_kqv      = 0.0e+00
llm_load_print_meta: f_max_alibi_bias = 0.0e+00
llm_load_print_meta: f_logit_scale    = 0.0e+00
llm_load_print_meta: n_ff             = 8192
llm_load_print_meta: n_expert         = 0
llm_load_print_meta: n_expert_used    = 0
llm_load_print_meta: causal attn      = 1
llm_load_print_meta: pooling type     = 0
llm_load_print_meta: rope type        = 0
llm_load_print_meta: rope scaling     = linear
llm_load_print_meta: freq_base_train  = 10000.0
llm_load_print_meta: freq_scale_train = 1
llm_load_print_meta: n_yarn_orig_ctx  = 4096
llm_load_print_meta: rope_finetuned   = unknown
llm_load_print_meta: ssm_d_conv       = 0
llm_load_print_meta: ssm_d_inner      = 0
llm_load_print_meta: ssm_d_state      = 0
llm_load_print_meta: ssm_dt_rank      = 0
llm_load_print_meta: model type       = 7B
llm_load_print_meta: model ftype      = Q4_K - Medium
llm_load_print_meta: model params     = 3.82 B
llm_load_print_meta: model size       = 2.16 GiB (4.85 BPW)
llm_load_print_meta: general.name     = LLaMA v2
llm_load_print_meta: BOS token        = 1 '<s>'
llm_load_print_meta: EOS token        = 32000 '<|endoftext|>'
llm_load_print_meta: UNK token        = 0 '<unk>'
llm_load_print_meta: PAD token        = 32000 '<|endoftext|>'
llm_load_print_meta: LF token         = 13 '<0x0A>'
llm_load_print_meta: EOT token        = 32007 '<|end|>'
llm_load_tensors: ggml ctx size =    0.30 MiB
ggml_backend_metal_buffer_from_ptr: allocated buffer, size =  2157.95 MiB, ( 2158.02 / 10922.67)
llm_load_tensors: offloading 32 repeating layers to GPU
llm_load_tensors: offloading non-repeating layers to GPU
llm_load_tensors: offloaded 33/33 layers to GPU
llm_load_tensors:        CPU buffer size =    52.84 MiB
llm_load_tensors:      Metal buffer size =  2157.95 MiB
.................................................................................................
llama_new_context_with_model: n_ctx      = 2048
llama_new_context_with_model: n_batch    = 512
llama_new_context_with_model: n_ubatch   = 512
llama_new_context_with_model: freq_base  = 10000.0
llama_new_context_with_model: freq_scale = 1
ggml_metal_init: allocating
ggml_metal_init: found device: Apple M1 Pro
ggml_metal_init: picking default device: Apple M1 Pro
ggml_metal_init: using embedded metal library
ggml_metal_init: GPU name:   Apple M1 Pro
ggml_metal_init: GPU family: MTLGPUFamilyApple7  (1007)
ggml_metal_init: GPU family: MTLGPUFamilyCommon3 (3003)
ggml_metal_init: GPU family: MTLGPUFamilyMetal3  (5001)
ggml_metal_init: simdgroup reduction support   = true
ggml_metal_init: simdgroup matrix mul. support = true
ggml_metal_init: hasUnifiedMemory              = true
ggml_metal_init: recommendedMaxWorkingSetSize  = 11453.25 MB
ggml_backend_metal_buffer_type_alloc_buffer: allocated buffer, size =   768.00 MiB, ( 2927.83 / 10922.67)
llama_kv_cache_init:      Metal KV buffer size =   768.00 MiB
llama_new_context_with_model: KV self size  =  768.00 MiB, K (f16):  384.00 MiB, V (f16):  384.00 MiB
llama_new_context_with_model:        CPU  output buffer size =     0.13 MiB
ggml_backend_metal_buffer_type_alloc_buffer: allocated buffer, size =   156.02 MiB, ( 3083.84 / 10922.67)
llama_new_context_with_model:      Metal compute buffer size =   156.00 MiB
llama_new_context_with_model:        CPU compute buffer size =    10.01 MiB
llama_new_context_with_model: graph nodes  = 1030
llama_new_context_with_model: graph splits = 2
{"function":"initialize","level":"INFO","line":448,"msg":"initializing slots","n_slots":1,"tid":"0x1f74a3ac0","timestamp":1715051308}
{"function":"initialize","level":"INFO","line":460,"msg":"new slot","n_ctx_slot":2048,"slot_id":0,"tid":"0x1f74a3ac0","timestamp":1715051308}
{"function":"main","level":"INFO","line":3067,"msg":"model loaded","tid":"0x1f74a3ac0","timestamp":1715051308}
{"function":"main","hostname":"127.0.0.1","level":"INFO","line":3270,"msg":"HTTP server listening","n_threads_http":"9","port":"59799","tid":"0x1f74a3ac0","timestamp":1715051308}
{"function":"update_slots","level":"INFO","line":1581,"msg":"all slots are idle and system prompt is empty, clear the KV cache","tid":"0x1f74a3ac0","timestamp":1715051308}
{"function":"process_single_task","level":"INFO","line":1513,"msg":"slot data","n_idle_slots":1,"n_processing_slots":0,"task_id":0,"tid":"0x1f74a3ac0","timestamp":1715051308}
{"function":"process_single_task","level":"INFO","line":1513,"msg":"slot data","n_idle_slots":1,"n_processing_slots":0,"task_id":1,"tid":"0x1f74a3ac0","timestamp":1715051308}
{"function":"log_server_request","level":"INFO","line":2744,"method":"GET","msg":"request","params":{},"path":"/health","remote_addr":"127.0.0.1","remote_port":59800,"status":200,"tid":"0x16fde7000","timestamp":1715051308}
{"function":"process_single_task","level":"INFO","line":1513,"msg":"slot data","n_idle_slots":1,"n_processing_slots":0,"task_id":2,"tid":"0x1f74a3ac0","timestamp":1715051308}
{"function":"log_server_request","level":"INFO","line":2744,"method":"GET","msg":"request","params":{},"path":"/health","remote_addr":"127.0.0.1","remote_port":59804,"status":200,"tid":"0x16fa9f000","timestamp":1715051308}
{"function":"process_single_task","level":"INFO","line":1513,"msg":"slot data","n_idle_slots":1,"n_processing_slots":0,"task_id":3,"tid":"0x1f74a3ac0","timestamp":1715051308}
{"function":"log_server_request","level":"INFO","line":2744,"method":"GET","msg":"request","params":{},"path":"/health","remote_addr":"127.0.0.1","remote_port":59801,"status":200,"tid":"0x16fe73000","timestamp":1715051308}
{"function":"process_single_task","level":"INFO","line":1513,"msg":"slot data","n_idle_slots":1,"n_processing_slots":0,"task_id":4,"tid":"0x1f74a3ac0","timestamp":1715051308}
{"function":"log_server_request","level":"INFO","line":2744,"method":"GET","msg":"request","params":{},"path":"/health","remote_addr":"127.0.0.1","remote_port":59803,"status":200,"tid":"0x16fbb7000","timestamp":1715051308}
{"function":"log_server_request","level":"INFO","line":2744,"method":"GET","msg":"request","params":{},"path":"/health","remote_addr":"127.0.0.1","remote_port":59802,"status":200,"tid":"0x16fa13000","timestamp":1715051308}
{"function":"process_single_task","level":"INFO","line":1513,"msg":"slot data","n_idle_slots":1,"n_processing_slots":0,"task_id":5,"tid":"0x1f74a3ac0","timestamp":1715051308}
{"function":"log_server_request","level":"INFO","line":2744,"method":"GET","msg":"request","params":{},"path":"/health","remote_addr":"127.0.0.1","remote_port":59807,"status":200,"tid":"0x16fccf000","timestamp":1715051308}
{"function":"process_single_task","level":"INFO","line":1513,"msg":"slot data","n_idle_slots":1,"n_processing_slots":0,"task_id":6,"tid":"0x1f74a3ac0","timestamp":1715051308}
{"function":"log_server_request","level":"INFO","line":2744,"method":"GET","msg":"request","params":{},"path":"/health","remote_addr":"127.0.0.1","remote_port":59807,"status":200,"tid":"0x16fccf000","timestamp":1715051308}
{"function":"log_server_request","level":"INFO","line":2744,"method":"POST","msg":"request","params":{},"path":"/tokenize","remote_addr":"127.0.0.1","remote_port":59807,"status":200,"tid":"0x16fccf000","timestamp":1715051308}
{"function":"process_single_task","level":"INFO","line":1513,"msg":"slot data","n_idle_slots":1,"n_processing_slots":0,"task_id":7,"tid":"0x1f74a3ac0","timestamp":1715051308}
{"function":"log_server_request","level":"INFO","line":2744,"method":"GET","msg":"request","params":{},"path":"/health","remote_addr":"127.0.0.1","remote_port":59807,"status":200,"tid":"0x16fccf000","timestamp":1715051308}
{"function":"log_server_request","level":"INFO","line":2744,"method":"POST","msg":"request","params":{},"path":"/tokenize","remote_addr":"127.0.0.1","remote_port":59807,"status":200,"tid":"0x16fccf000","timestamp":1715051308}
{"function":"process_single_task","level":"INFO","line":1513,"msg":"slot data","n_idle_slots":1,"n_processing_slots":0,"task_id":8,"tid":"0x1f74a3ac0","timestamp":1715051308}
{"function":"log_server_request","level":"INFO","line":2744,"method":"GET","msg":"request","params":{},"path":"/health","remote_addr":"127.0.0.1","remote_port":59808,"status":200,"tid":"0x16fd5b000","timestamp":1715051308}
{"function":"launch_slot_with_data","level":"INFO","line":833,"msg":"slot is processing task","slot_id":0,"task_id":9,"tid":"0x1f74a3ac0","timestamp":1715051308}
{"function":"update_slots","ga_i":0,"level":"INFO","line":1819,"msg":"slot progression","n_past":0,"n_past_se":0,"n_prompt_tokens_processed":55,"slot_id":0,"task_id":9,"tid":"0x1f74a3ac0","timestamp":1715051308}
{"function":"update_slots","level":"INFO","line":1843,"msg":"kv cache rm [p0, end)","p0":0,"slot_id":0,"task_id":9,"tid":"0x1f74a3ac0","timestamp":1715051308}
[GIN] 2024/05/07 - 13:08:30 | 200 |      10.292µs |    192.168.4.24 | HEAD     "/"
[GIN] 2024/05/07 - 13:08:31 | 200 |       21.75µs | 2403:5804:300:0:5c96:5280:db34:b3d5 | HEAD     "/"
{"function":"print_timings","level":"INFO","line":276,"msg":"prompt eval time     =     198.24 ms /    55 tokens (    3.60 ms per token,   277.44 tokens per second)","n_prompt_tokens_processed":55,"n_tokens_second":277.4428845988933,"slot_id":0,"t_prompt_processing":198.239,"t_token":3.6043454545454545,"task_id":9,"tid":"0x1f74a3ac0","timestamp":1715051315}
{"function":"print_timings","level":"INFO","line":290,"msg":"generation eval time =    5972.57 ms /   242 runs   (   24.68 ms per token,    40.52 tokens per second)","n_decoded":242,"n_tokens_second":40.518550380213014,"slot_id":0,"t_token":24.680053719008267,"t_token_generation":5972.573,"task_id":9,"tid":"0x1f74a3ac0","timestamp":1715051315}
{"function":"print_timings","level":"INFO","line":299,"msg":"          total time =    6170.81 ms","slot_id":0,"t_prompt_processing":198.239,"t_token_generation":5972.573,"t_total":6170.812,"task_id":9,"tid":"0x1f74a3ac0","timestamp":1715051315}
{"function":"update_slots","level":"INFO","line":1651,"msg":"slot released","n_cache_tokens":297,"n_ctx":2048,"n_past":296,"n_system_tokens":0,"slot_id":0,"task_id":9,"tid":"0x1f74a3ac0","timestamp":1715051315,"truncated":false}
{"function":"log_server_request","level":"INFO","line":2744,"method":"POST","msg":"request","params":{},"path":"/completion","remote_addr":"127.0.0.1","remote_port":59808,"status":200,"tid":"0x16fd5b000","timestamp":1715051315}
[GIN] 2024/05/07 - 13:08:35 | 200 |  7.859950166s | 2403:5804:300:0:5c96:5280:db34:b3d5 | POST     "/api/chat"
[GIN] 2024/05/07 - 13:08:35 | 200 |       8.542µs |    192.168.4.24 | HEAD     "/"
[GIN] 2024/05/07 - 13:08:36 | 200 |       14.25µs | 2403:5804:300:0:5c96:5280:db34:b3d5 | HEAD     "/"
[GIN] 2024/05/07 - 13:08:40 | 200 |         103µs |    192.168.4.24 | HEAD     "/"
[GIN] 2024/05/07 - 13:08:41 | 200 |      23.625µs | 2403:5804:300:0:5c96:5280:db34:b3d5 | HEAD     "/"
[GIN] 2024/05/07 - 13:08:45 | 200 |      11.458µs |    192.168.4.24 | HEAD     "/"
[GIN] 2024/05/07 - 13:08:46 | 200 |      44.583µs | 2403:5804:300:0:5c96:5280:db34:b3d5 | HEAD     "/"
[GIN] 2024/05/07 - 13:08:50 | 200 |       7.458µs |    192.168.4.24 | HEAD     "/"
[GIN] 2024/05/07 - 13:08:51 | 200 |       9.084µs | 2403:5804:300:0:5c96:5280:db34:b3d5 | HEAD     "/"
[GIN] 2024/05/07 - 13:08:55 | 200 |      24.792µs |    192.168.4.24 | HEAD     "/"
[GIN] 2024/05/07 - 13:08:56 | 200 |       8.667µs | 2403:5804:300:0:5c96:5280:db34:b3d5 | HEAD     "/"
[GIN] 2024/05/07 - 13:09:00 | 200 |       9.875µs |    192.168.4.24 | HEAD     "/"

cloudflared log

2024-05-07T03:08:26Z DBG HEAD https://ollama.notmyrealwebsite.com/ HTTP/1.1 connIndex=2 content-length=0 event=1 headers={"Accept":["*/*"],"Accept-Encoding":["gzip, br"],"Accept-Language":["en-AU;q=1.0"],"Authorization":["Bearer"],"Cdn-Loop":["cloudflare"],"Cf-Connecting-Ip":["2403:5804:300:0:5c96:5280:db34:b3d5"],"Cf-Ipcountry":["AU"],"Cf-Ray":["87fe01699953551b-SYD"],"Cf-Visitor":["{\"scheme\":\"https\"}"],"Cf-Warp-Tag-Id":["cbd96e62-87d6-4797-968e-11af89776d22"],"Content-Type":["application/json"],"User-Agent":["Enchanted/1.6.5 (subj.Enchanted; build:29; macOS 14.4.1) Alamofire/5.9.0"],"X-Forwarded-For":["2403:5804:300:0:5c96:5280:db34:b3d5"],"X-Forwarded-Proto":["https"]} host=ollama.notmyrealwebsite.com ingressRule=0 originService=http://192.168.4.23:11434 path=/
2024-05-07T03:08:26Z DBG 200 OK connIndex=2 content-length=17 event=1 ingressRule=0 originService=http://192.168.4.23:11434
2024-05-07T03:08:26Z DBG Failed to send ICMP reply error="funnel not found" dst=1.0.0.1:0
2024-05-07T03:08:27Z DBG Failed to parse ICMP reply, continue to parse as full packet error="expect ICMP echo, got neighbor advertisement" dst=[fe80::1c05:88d3:9f49:eb5d%en0]:0
2024-05-07T03:08:27Z DBG Failed to parse ICMP reply as full packet error="unknow ip version 8" dst=[fe80::1c05:88d3:9f49:eb5d%en0]:0
2024-05-07T03:08:27Z DBG HEAD https://ollama.notmyrealwebsite.com/ HTTP/1.1 connIndex=2 content-length=0 event=1 headers={"Accept":["*/*"],"Accept-Encoding":["gzip, br"],"Accept-Language":["en-AU;q=1.0"],"Authorization":["Bearer"],"Cdn-Loop":["cloudflare"],"Cf-Connecting-Ip":["2403:5804:300:0:5c96:5280:db34:b3d5"],"Cf-Ipcountry":["AU"],"Cf-Ray":["87fe016dbc43551b-SYD"],"Cf-Visitor":["{\"scheme\":\"https\"}"],"Cf-Warp-Tag-Id":["cbd96e62-87d6-4797-968e-11af89776d22"],"Content-Type":["application/json"],"User-Agent":["Enchanted/1.6.5 (subj.Enchanted; build:29; macOS 14.4.1) Alamofire/5.9.0"],"X-Forwarded-For":["2403:5804:300:0:5c96:5280:db34:b3d5"],"X-Forwarded-Proto":["https"]} host=ollama.notmyrealwebsite.com ingressRule=0 originService=http://192.168.4.23:11434 path=/
2024-05-07T03:08:27Z DBG 200 OK connIndex=2 content-length=17 event=1 ingressRule=0 originService=http://192.168.4.23:11434
2024-05-07T03:08:27Z DBG POST https://ollama.notmyrealwebsite.com/api/chat HTTP/1.1 connIndex=2 content-length=362 event=1 headers={"Accept":["*/*"],"Accept-Encoding":["gzip, br"],"Accept-Language":["en-AU;q=1.0"],"Authorization":["Bearer"],"Cdn-Loop":["cloudflare"],"Cf-Connecting-Ip":["2403:5804:300:0:5c96:5280:db34:b3d5"],"Cf-Ipcountry":["AU"],"Cf-Ray":["87fe016e3c7b551b-SYD"],"Cf-Visitor":["{\"scheme\":\"https\"}"],"Cf-Warp-Tag-Id":["cbd96e62-87d6-4797-968e-11af89776d22"],"Content-Length":["362"],"Content-Type":["application/json"],"User-Agent":["Enchanted/1.6.5 (subj.Enchanted; build:29; macOS 14.4.1) Alamofire/5.9.0"],"X-Forwarded-For":["2403:5804:300:0:5c96:5280:db34:b3d5"],"X-Forwarded-Proto":["https"]} host=ollama.notmyrealwebsite.com ingressRule=0 originService=http://192.168.4.23:11434 path=/api/chat
2024-05-07T03:08:29Z DBG Failed to send ICMP reply error="funnel not found" dst=1.0.0.1:0
2024-05-07T03:08:31Z DBG Failed to send ICMP reply error="funnel not found" dst=1.0.0.1:0
2024-05-07T03:08:31Z DBG HEAD https://ollama.notmyrealwebsite.com/ HTTP/1.1 connIndex=2 content-length=0 event=1 headers={"Accept":["*/*"],"Accept-Encoding":["gzip, br"],"Accept-Language":["en-AU;q=1.0"],"Authorization":["Bearer"],"Cdn-Loop":["cloudflare"],"Cf-Connecting-Ip":["2403:5804:300:0:5c96:5280:db34:b3d5"],"Cf-Ipcountry":["AU"],"Cf-Ray":["87fe0188dee7551b-SYD"],"Cf-Visitor":["{\"scheme\":\"https\"}"],"Cf-Warp-Tag-Id":["cbd96e62-87d6-4797-968e-11af89776d22"],"Content-Type":["application/json"],"User-Agent":["Enchanted/1.6.5 (subj.Enchanted; build:29; macOS 14.4.1) Alamofire/5.9.0"],"X-Forwarded-For":["2403:5804:300:0:5c96:5280:db34:b3d5"],"X-Forwarded-Proto":["https"]} host=ollama.notmyrealwebsite.com ingressRule=0 originService=http://192.168.4.23:11434 path=/
2024-05-07T03:08:31Z DBG 200 OK connIndex=2 content-length=17 event=1 ingressRule=0 originService=http://192.168.4.23:11434
2024-05-07T03:08:33Z DBG Failed to send ICMP reply error="funnel not found" dst=1.0.0.1:0
2024-05-07T03:08:35Z DBG 200 OK connIndex=2 content-length=-1 event=1 ingressRule=0 originService=http://192.168.4.23:11434
2024-05-07T03:08:35Z DBG Failed to send ICMP reply error="funnel not found" dst=1.0.0.1:0
2024-05-07T03:08:36Z DBG HEAD https://ollama.notmyrealwebsite.com/ HTTP/1.1 connIndex=2 content-length=0 event=1 headers={"Accept":["*/*"],"Accept-Encoding":["gzip, br"],"Accept-Language":["en-AU;q=1.0"],"Authorization":["Bearer"],"Cdn-Loop":["cloudflare"],"Cf-Connecting-Ip":["2403:5804:300:0:5c96:5280:db34:b3d5"],"Cf-Ipcountry":["AU"],"Cf-Ray":["87fe01a81f38551b-SYD"],"Cf-Visitor":["{\"scheme\":\"https\"}"],"Cf-Warp-Tag-Id":["cbd96e62-87d6-4797-968e-11af89776d22"],"Content-Type":["application/json"],"User-Agent":["Enchanted/1.6.5 (subj.Enchanted; build:29; macOS 14.4.1) Alamofire/5.9.0"],"X-Forwarded-For":["2403:5804:300:0:5c96:5280:db34:b3d5"],"X-Forwarded-Proto":["https"]} host=ollama.notmyrealwebsite.com ingressRule=0 originService=http://192.168.4.23:11434 path=/
2024-05-07T03:08:36Z DBG 200 OK connIndex=2 content-length=17 event=1 ingressRule=0 originService=http://192.168.4.23:11434
2024-05-07T03:08:37Z DBG Failed to send ICMP reply error="funnel not found" dst=1.0.0.1:0
2024-05-07T03:08:39Z DBG Failed to send ICMP reply error="funnel not found" dst=1.0.0.1:0
2024-05-07T03:08:41Z DBG Failed to send ICMP reply error="funnel not found" dst=1.0.0.1:0
2024-05-07T03:08:41Z DBG HEAD https://ollama.notmyrealwebsite.com/ HTTP/1.1 connIndex=2 content-length=0 event=1 headers={"Accept":["*/*"],"Accept-Encoding":["gzip, br"],"Accept-Language":["en-AU;q=1.0"],"Authorization":["Bearer"],"Cdn-Loop":["cloudflare"],"Cf-Connecting-Ip":["2403:5804:300:0:5c96:5280:db34:b3d5"],"Cf-Ipcountry":["AU"],"Cf-Ray":["87fe01c75fdc551b-SYD"],"Cf-Visitor":["{\"scheme\":\"https\"}"],"Cf-Warp-Tag-Id":["cbd96e62-87d6-4797-968e-11af89776d22"],"Content-Type":["application/json"],"User-Agent":["Enchanted/1.6.5 (subj.Enchanted; build:29; macOS 14.4.1) Alamofire/5.9.0"],"X-Forwarded-For":["2403:5804:300:0:5c96:5280:db34:b3d5"],"X-Forwarded-Proto":["https"]} host=ollama.notmyrealwebsite.com ingressRule=0 originService=http://192.168.4.23:11434 path=/
2024-05-07T03:08:41Z DBG 200 OK connIndex=2 content-length=17 event=1 ingressRule=0 originService=http://192.168.4.23:11434
2024-05-07T03:08:43Z DBG Failed to send ICMP reply error="funnel not found" dst=1.0.0.1:0
2024-05-07T03:08:45Z DBG Failed to send ICMP reply error="funnel not found" dst=1.0.0.1:0
2024-05-07T03:08:46Z DBG HEAD https://ollama.notmyrealwebsite.com/ HTTP/1.1 connIndex=2 content-length=0 event=1 headers={"Accept":["*/*"],"Accept-Encoding":["gzip, br"],"Accept-Language":["en-AU;q=1.0"],"Authorization":["Bearer"],"Cdn-Loop":["cloudflare"],"Cf-Connecting-Ip":["2403:5804:300:0:5c96:5280:db34:b3d5"],"Cf-Ipcountry":["AU"],"Cf-Ray":["87fe01e69dca551b-SYD"],"Cf-Visitor":["{\"scheme\":\"https\"}"],"Cf-Warp-Tag-Id":["cbd96e62-87d6-4797-968e-11af89776d22"],"Content-Type":["application/json"],"User-Agent":["Enchanted/1.6.5 (subj.Enchanted; build:29; macOS 14.4.1) Alamofire/5.9.0"],"X-Forwarded-For":["2403:5804:300:0:5c96:5280:db34:b3d5"],"X-Forwarded-Proto":["https"]} host=ollama.notmyrealwebsite.com ingressRule=0 originService=http://192.168.4.23:11434 path=/
2024-05-07T03:08:46Z DBG 200 OK connIndex=2 content-length=17 event=1 ingressRule=0 originService=http://192.168.4.23:11434
2024-05-07T03:08:46Z DBG Failed to parse ICMP reply, continue to parse as full packet error="expect ICMP echo, got neighbor solicitation" dst=[fe80::4e:2d8c:4657:840e%en0]:0
2024-05-07T03:08:46Z DBG Failed to parse ICMP reply as full packet error="unknow ip version 8" dst=[fe80::4e:2d8c:4657:840e%en0]:0
2024-05-07T03:08:47Z DBG Failed to send ICMP reply error="funnel not found" dst=1.0.0.1:0
2024-05-07T03:08:48Z DBG Failed to parse ICMP reply, continue to parse as full packet error="expect ICMP echo, got neighbor solicitation" dst=[fe80::18d1:2ced:d5db:6730%en0]:0
2024-05-07T03:08:48Z DBG Failed to parse ICMP reply as full packet error="unknow ip version 8" dst=[fe80::18d1:2ced:d5db:6730%en0]:0
2024-05-07T03:08:49Z DBG Failed to send ICMP reply error="funnel not found" dst=1.0.0.1:0
2024-05-07T03:08:51Z DBG Failed to parse ICMP reply, continue to parse as full packet error="expect ICMP echo, got neighbor solicitation" dst=[fe80::4a4:6e44:a87d:67ba%en0]:0
2024-05-07T03:08:51Z DBG Failed to parse ICMP reply as full packet error="unknow ip version 8" dst=[fe80::4a4:6e44:a87d:67ba%en0]:0
2024-05-07T03:08:51Z DBG HEAD https://ollama.notmyrealwebsite.com/ HTTP/1.1 connIndex=2 content-length=0 event=1 headers={"Accept":["*/*"],"Accept-Encoding":["gzip, br"],"Accept-Language":["en-AU;q=1.0"],"Authorization":["Bearer"],"Cdn-Loop":["cloudflare"],"Cf-Connecting-Ip":["2403:5804:300:0:5c96:5280:db34:b3d5"],"Cf-Ipcountry":["AU"],"Cf-Ray":["87fe0205da0e551b-SYD"],"Cf-Visitor":["{\"scheme\":\"https\"}"],"Cf-Warp-Tag-Id":["cbd96e62-87d6-4797-968e-11af89776d22"],"Content-Type":["application/json"],"User-Agent":["Enchanted/1.6.5 (subj.Enchanted; build:29; macOS 14.4.1) Alamofire/5.9.0"],"X-Forwarded-For":["2403:5804:300:0:5c96:5280:db34:b3d5"],"X-Forwarded-Proto":["https"]} host=ollama.notmyrealwebsite.com ingressRule=0 originService=http://192.168.4.23:11434 path=/
2024-05-07T03:08:51Z DBG 200 OK connIndex=2 content-length=17 event=1 ingressRule=0 originService=http://192.168.4.23:11434
2024-05-07T03:08:51Z DBG Failed to send ICMP reply error="funnel not found" dst=1.0.0.1:0
2024-05-07T03:08:52Z DBG Failed to parse ICMP reply, continue to parse as full packet error="expect ICMP echo, got neighbor advertisement" dst=[fe80::4e:2d8c:4657:840e%en0]:0
2024-05-07T03:08:52Z DBG Failed to parse ICMP reply as full packet error="unknow ip version 8" dst=[fe80::4e:2d8c:4657:840e%en0]:0
2024-05-07T03:08:53Z DBG Failed to send ICMP reply error="funnel not found" dst=1.0.0.1:0
2024-05-07T03:08:54Z DBG Failed to parse ICMP reply, continue to parse as full packet error="expect ICMP echo, got neighbor advertisement" dst=[fe80::18d1:2ced:d5db:6730%en0]:0
2024-05-07T03:08:54Z DBG Failed to parse ICMP reply as full packet error="unknow ip version 8" dst=[fe80::18d1:2ced:d5db:6730%en0]:0
2024-05-07T03:08:55Z DBG Failed to send ICMP reply error="funnel not found" dst=1.0.0.1:0
2024-05-07T03:08:56Z DBG Failed to parse ICMP reply, continue to parse as full packet error="expect ICMP echo, got neighbor advertisement" dst=[fe80::9ea5:70ff:fe72:148d%en0]:0
2024-05-07T03:08:56Z DBG Failed to parse ICMP reply as full packet error="unknow ip version 8" dst=[fe80::9ea5:70ff:fe72:148d%en0]:0
2024-05-07T03:08:56Z DBG Failed to parse ICMP reply, continue to parse as full packet error="expect ICMP echo, got neighbor advertisement" dst=[fe80::4a4:6e44:a87d:67ba%en0]:0
2024-05-07T03:08:56Z DBG Failed to parse ICMP reply as full packet error="unknow ip version 8" dst=[fe80::4a4:6e44:a87d:67ba%en0]:0
2024-05-07T03:08:56Z DBG HEAD https://ollama.notmyrealwebsite.com/ HTTP/1.1 connIndex=2 content-length=0 event=1 headers={"Accept":["*/*"],"Accept-Encoding":["gzip, br"],"Accept-Language":["en-AU;q=1.0"],"Authorization":["Bearer"],"Cdn-Loop":["cloudflare"],"Cf-Connecting-Ip":["2403:5804:300:0:5c96:5280:db34:b3d5"],"Cf-Ipcountry":["AU"],"Cf-Ray":["87fe02251ee5551b-SYD"],"Cf-Visitor":["{\"scheme\":\"https\"}"],"Cf-Warp-Tag-Id":["cbd96e62-87d6-4797-968e-11af89776d22"],"Content-Type":["application/json"],"User-Agent":["Enchanted/1.6.5 (subj.Enchanted; build:29; macOS 14.4.1) Alamofire/5.9.0"],"X-Forwarded-For":["2403:5804:300:0:5c96:5280:db34:b3d5"],"X-Forwarded-Proto":["https"]} host=ollama.notmyrealwebsite.com ingressRule=0 originService=http://192.168.4.23:11434 path=/
2024-05-07T03:08:56Z DBG 200 OK connIndex=2 content-length=17 event=1 ingressRule=0 originService=http://192.168.4.23:11434
2024-05-07T03:08:57Z DBG Failed to send ICMP reply error="funnel not found" dst=1.0.0.1:0
2024-05-07T03:09:00Z DBG Failed to send ICMP reply error="funnel not found" dst=1.0.0.1:0

This run looks better than earlier ones; it has failed multiple times besides this, but I don't have correlated logs for those failures at hand.
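
For correlating the client side with logs like the ones above (where the server reports n_decoded=242 and returns 200 to cloudflared, yet the app shows a short reply), here is a minimal sketch that consumes the stream through the tunnel and records how many chunks arrive and whether a final done: true object ever shows up. The hostname is a placeholder and `requests` is assumed to be installed.

```python
import json
import time
import requests

# Placeholder tunnel hostname; replace with your own.
URL = "https://ollama.example.com/api/chat"

payload = {
    "model": "phi3:latest",
    "messages": [{"role": "user", "content": "Explain how a reverse tunnel works."}],
}

chunks = 0
saw_done = False
start = time.monotonic()

with requests.post(URL, json=payload, stream=True, timeout=300) as r:
    r.raise_for_status()
    for line in r.iter_lines():
        if not line:
            continue
        obj = json.loads(line)
        chunks += 1
        if obj.get("done"):
            saw_done = True
            # The final chunk carries server-side counters to compare with the ollama log.
            print("eval_count reported by server:", obj.get("eval_count"))

print(f"received {chunks} chunks in {time.monotonic() - start:.1f}s, done flag seen: {saw_done}")
```

If the stream stops without ever delivering a done: true chunk, the connection is being closed mid-stream somewhere along the tunnel path; if all chunks and the done flag arrive but the app still displays a short reply, the truncation is more likely in how the app assembles the stream.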
