Am I limited to only specific partners on the Log Stream?

End Goal: I’m trying to figure out how to get logs from render to show up in Grafana cloud.

AFAIK I need to have a syslog sink that forwards logs to a Loki client that converts the logs to a format understood by Grafana.

I set up a private service syslog-ng sink and it starts fine, but the “Log Endpoint” in my Log Stream config does not accept the internal service hostname.

Setting up the syslog-ng as a web service exposes normal web ports (443/80) but I am wary of exposing a syslog sink directly on those ports without some sort of mutual TLS. Unfortunately, it appears that log streams do not allow me to set a client cert.

Am I limited to only specific partners on the Log Stream?

Adam,
We’ve been discussing this internally and agree that it’s an interesting use case. We suspect we have validation that prevents the use of internal addresses as the Log Stream endpoint. I’ll report back when I have more info.

John B

Thanks John!

I look forward to learning more. Also, if there is a better or more advisable way to accomplish my end goal, I am very open to changing my approach.

Hi @John_B,

Just following up, I know you are probably very busy. Just a friendly bump to see if you have learned more, but please feel free to ignore if you have not yet.

Thank you for your time!

Adam

We don’t have any immediate solution here - we’d not considered customers wanting to log stream to services inside Render, so we’ve put it on the feature request list for now.

Regards,

John B

Not sure if you’re using Java but this is one way.

I would avoid the log stream and either try the above or running a Loki agent on your instance that can handle aggregating the logs.
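If you go the Loki agent route, Grafana’s Promtail is the usual choice. A minimal config sketch is below; the port, file path, label names, and the Grafana Cloud push URL placeholder are all assumptions, not details from this thread:

```yaml
server:
  http_listen_port: 9080

positions:
  filename: /tmp/positions.yaml

clients:
  # Replace with your Grafana Cloud Loki push endpoint and credentials.
  - url: https://<user>:<api-key>@<your-loki-host>/loki/api/v1/push

scrape_configs:
  - job_name: app
    static_configs:
      - targets: [localhost]
        labels:
          job: app
          __path__: /var/log/app/*.log
```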

we’d not considered customers wanting to log stream to services inside Render, so we’ve put it on the feature request list for now.

Fair enough, thank you for the context 🙂

Not sure if you’re using Java but this is one way.

Fabulous idea! This particular service is Ruby, but the concept is applicable and works great! Thank you!

For those who may come later. The solution is very similar to the log4j solution that @trollbearpig recommended. You can leverage semantic_logger in ruby and write a simple custom appender that extends SemanticLogger::Appender::Http.

Note: Loki’s API expects nanosecond timestamps on log entries, so you can build a small formatter as well.

Below are rough snippets for reference only:

formatter

def call(log, logger)
  raw = super(log, logger)
  # Loki expects each entry as ["<unix epoch in nanoseconds>", "<log line>"]
  [((log.time.to_i * (10**9)) + log.time.nsec).to_s, raw.to_json]
end
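As a quick sanity check on the timestamp math, here it is in plain Ruby (no gems; the values are illustrative):

```ruby
# Convert a Ruby Time to the string of integer nanoseconds Loki expects.
def loki_timestamp(time)
  ((time.to_i * (10**9)) + time.nsec).to_s
end

t = Time.at(1_700_000_000, 123_456.789) # 123_456.789 µs past the second
loki_timestamp(t) # => "1700000000123456789"
```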

appender

# Logs a single entry
def log(log)
  message = formatter.call(log, self)
  post(data(message))
end

# Logs in batches
def batch(logs)
  logs.map! { |log| formatter.call(log, self) }
  post(data(logs))
end

private

# Wraps formatted entries in the JSON body Loki's push API expects
def data(logs)
  {
    streams: [
      {
        stream: stream,
        values: Array(logs)
      }
    ]
  }.to_json
end